Did you know there is an official LinkedIn archive called engineering.linkedin.com/blog?

This blog contains the truth about how LinkedIn actually views its user base.

Here are my synopses of just a couple of the posts to be found there. Prepare to be shocked.

I first reviewed:
https://engineering.linkedin.com/blog/2015/10/how-experimentation-helped-linkedin-improve-email-communication
Key points:
  • LinkedIn has an experimentation and optimization platform called XLNT "which keeps track of what experiments and variants are running for all of LinkedIn".
  • When a change to the LinkedIn software is made, XLNT is used to perform so called A/B testing, which is first carried out on a small random sample of users to see what effect the change has.
  • The effect is measured by "suitably" chosen criteria. For example, in this particular blog post, the author considers changes to emails sent out by LinkedIn. The criteria chosen were the benefit of more reads and clicks of the emails they send out vs the downside of the number of emails flagged as spam.
  • Before launching a change site-wide, it is often first rolled out only to subgroups of users for further testing. The examples given are using only "active" users (whatever that means) or only users in a certain georegion.
  • The author claims that "This combination of optimization and experimentation has been highly effective. So far this year, it has allowed LinkedIn to send 40 percent less email and reduce email complaints by half, with a positive member response. In addition, this has led to an increase in email engagement rates".
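The ramp-up and subgroup targeting described in these points can be sketched in a few lines. Everything below (the hashing scheme, the function names, the "region" field) is my own illustrative guess at how such a system might work, not anything LinkedIn has documented:

```python
import hashlib

def bucket(user_id, experiment, ramp_pct):
    """Deterministically place a user in the variant for ramp_pct% of traffic.

    Hashing the experiment name together with the user id means each user
    gets a stable assignment, and different experiments split independently.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return int(digest, 16) % 100 < ramp_pct

def in_experiment(user, experiment, ramp_pct, region=None):
    """Optionally restrict to a subgroup (e.g. a georegion) before ramping."""
    if region is not None and user.get("region") != region:
        return False
    return bucket(user["id"], experiment, ramp_pct)

# A 5% ramp, restricted to one region (all names invented for illustration)
user = {"id": "u12345", "region": "US"}
print(in_experiment(user, "email_frequency_v2", ramp_pct=5, region="US"))
```

Note that the user never sees any of this: assignment happens server-side, which is exactly why participants are unaware they are in an experiment.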
Analysis: LinkedIn perform live experiments on users and user accounts. Users are not notified that they are part of experiments. LinkedIn intentionally phase changes in by georegion and activity.

LinkedIn have an optimization/experimentation program. This appears in itself to be a commercially viable product or service. By proving its worth on its own users, LinkedIn may be looking to monetize it for other large website developers. So continually rolling out changes to its own platform has commercial benefits for LinkedIn which have very little to do with the needs of the users. Change, and the testing of change, is therefore part and parcel of LinkedIn's monetization of its users. In other words, we can expect continual changes into the future, because it is a way for them to demonstrate the value of part of their technology to others.

Another blog post on this theme
https://engineering.linkedin.com/blog/2015/10/fine-tuning-premium-products-through-a-b-testing
considers the use of XLNT for Premium products.
Key points:
  • LinkedIn "knows" that Premium users were "overwhelmed" by the number of versions of Premium accounts which were on offer.
  • LinkedIn have reduced the number of packages from 12 to 4.
  • These changes required a major overhaul of the LinkedIn software.
  • "While we were excited about the product changes, we wanted to be mindful of the business impact a large change like this could bring. Premium subscriptions is a large business that accounted for 18% of LinkedIn’s revenue in Q2 2015".
  • The XLNT software was used to test the impact of this change; success was measured by which version (12 or 4 products) produced more sign-ups.
  • The tests, however, involved other changes too, such as the inclusion of a FAQ page.
  • "Prior to launch, we debated endlessly on the size of [clickable buttons]. There was a strong belief that a bolder [button] will drive more sign-ups. Based on feedback from our design team, we decided to test the bold [button] against a more subtle one. The design team won this contest - the subtle [button] performed better and resulted in an improved acquisition flow. We quickly dropped the big blue button!"
  • Simplification and iterating changes at a fast pace are identified as key to LinkedIn's plans.
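The 12-versus-4 comparison the post describes boils down to comparing two sign-up rates. A minimal sketch, with entirely invented numbers (LinkedIn publishes no figures for this test):

```python
import math

# Invented traffic and sign-up counts, purely for illustration
control = {"packages": 12, "visitors": 50_000, "signups": 1_000}
variant = {"packages": 4, "visitors": 50_000, "signups": 1_150}

def z_score(s_a, n_a, s_b, n_b):
    """Two-proportion z-test: is the variant's sign-up rate really higher?"""
    p_a, p_b = s_a / n_a, s_b / n_b
    pooled = (s_a + s_b) / (n_a + n_b)  # common rate under the null hypothesis
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

lift = variant["signups"] / control["signups"] - 1  # valid: equal traffic per arm
z = z_score(control["signups"], control["visitors"],
            variant["signups"], variant["visitors"])
print(f"lift: {lift:.1%}, z: {z:.2f}")  # |z| > 1.96 ~ significant at the 5% level
```

Even a statistically significant result here establishes only that this particular 4-package page beat this particular 12-package page on this one metric at this one time.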
Analysis: The only thing that LinkedIn have proven here is that 4 specific choices of Premium offering are better than 12 different specific packages at the time at which the A/B testing was done, when "better" is measured in terms of the number of sign-ups. That's all. This is, unfortunately, pseudo-science of the worst kind, and we are being subjected to it.

They have not shown, or even measured, whether this change has any adverse effects on other parts of the LinkedIn user experience. They have not shown whether a different choice of 4 packages would be better still. They have not shown that 1 or 2 or 6 or 8 or 10 packages are actually better than 4. Just that 4 are better than 12.

They have not measured the impact of any of the other changes to the platform on the sign-up rate for Premium services. They have not gathered evidence on whether different changes have effects on Premium sign-ups which might simply swamp the change from 12 to 4, making it largely irrelevant.
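To attribute the effect to the package reduction rather than to, say, the new FAQ page, each change would need to be its own experimental factor. A minimal sketch of the 2x2 factorial layout that would be required (the arm structure is entirely hypothetical, not something LinkedIn describes running):

```python
from itertools import product

# The two changes the blog post bundles into a single test
factors = {
    "packages": [12, 4],
    "faq_page": [False, True],
}

# One arm per combination, so each change's effect can be estimated separately
arms = [dict(zip(factors, combo)) for combo in product(*factors.values())]
for arm in arms:
    print(arm)
# 4 arms in total: every combination of the two changes
```

Testing only the bundle (12 packages, no FAQ) against (4 packages, with FAQ) leaves the two middle arms missing, which is exactly why the individual effects cannot be separated.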

They have conflated increased sign-ups with users' satisfaction with those services after they have signed up and found out what they are getting. These are far from the same thing. They may have come up with an improved marketing strategy, but this does not mean they have a better product from the users' perspective.

They have not shown that when another major change to the user experience is made, the outcome of the test will not be radically altered. That is to say, if other aspects of the environment were radically improved, then under those circumstances users might in fact welcome more choice of Premium products.

They have shown that their assumptions often turn out to be wrong when tested. Yet they have not run their XLNT program on their fundamental assumption. They have simply not tested, through A/B testing, the premise that overall simplification and a fast pace of change is in itself a good thing. If they are wrong about these fundamentals, then everything they are doing, everything they are putting users and their businesses through, is a complete and utter nonsense.

Most of all this demonstrates what we all already know. LinkedIn think of users not as people but as pixels. These blog posts show an utter contempt for users, down to the level that they think they can play with our businesses for the purposes of internal competitions and a bit of sport and then even joke about it.

It also shows that while they are concerned with the effects on their own business and quarterly profits, the impact on our businesses does not even appear on their radar; it is not even in their lexicon.
The key points above, about the "button" and about whose profits are of concern, are the ones I wish readers to reflect on the most.

If you like my posts about LinkedIn, may I ask for your support in highlighting my e-Manual for Professional Profiles in 2016. I would very much appreciate your retweet of the embedded tweet below.