Monday, November 16, 2015

The SMAC stack

Technology changes with time, but as the French say, the more things change, the more they stay the same. Lately, four technologies have been gaining wider traction and acceptance in the markets - both in the stock market, where companies are valued and shareholders (and ahem, Wall Street) make money, and within the companies themselves, which are able to sell more product and reap rich rewards for their employees while truly delighting their customers with cheery new offerings. So let's get on with it: the SMAC stack - technologies that will rule the roost into the near- to mid-term future.

Social: the rise of Facebook, Twitter, and related companies has been meteoric, and it has been incredibly difficult for even established large players in related domains (Google+, anyone?) to take market share from them.

Mobile: in the days of yore, Bell Labs was one of the greatest (if not _the_ greatest) places where advanced technologies were incubated. In fact, that's where mobile phone technology first came from... and they worked on some new (at the time) realizations of specialized services for location, collaboration, content delivery, and so on to the handset - using industry-standard APIs like Parlay and OSA. Sadly, Bell Labs no longer exists in any form close to its former glory, and most of the (good) people moved on to academia, Google, Microsoft Research, Wall Street, or other such places - but mobile still rocks. Attempts to standardize APIs seem to have failed, and who needs them anyway when companies can build proprietary solutions in walled gardens, or gradually open up those walled gardens to gain market share, each with their own APIs?

Analytics: care to extract some actionable intelligence from huge data sets? What about making data-driven (as opposed to whimsical) decisions to advance a corporate cause? Well, analytics is your thing then. With the proliferation of machine learning, improved (and improving) algorithms, massive parallelization, and larger, more efficient means of accessing big data sets, analytics will continue to grow. Be warned though, as Andrew Lo says, "if you torture a data set long enough, it will confess to anything."
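To make the data-driven part concrete, here is a minimal sketch in Python using scikit-learn (my illustrative choice, not something the post prescribes) of the basic analytics loop: fit a model on historical data, hold out a test set so you don't end up torturing the data, and turn the result into scores you can act on.

```python
# A minimal, illustrative analytics sketch (assumes scikit-learn is installed).
# The synthetic data stands in for whatever "huge data set" you actually have.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

# Pretend these are historical records with a known outcome (e.g., churned or not).
X, y = make_classification(n_samples=10_000, n_features=20, random_state=0)

# Hold out a test set: an honest check against "torturing the data".
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Actionable intelligence: probability scores you can rank and act on.
scores = model.predict_proba(X_test)[:, 1]
print("held-out AUC:", roc_auc_score(y_test, scores))
```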

Cloud: you've got to keep your content somewhere. Cloud, anyone? First there was Java with WORA (write once, run anywhere), but that is so late-90s now that virtualization has become commonplace. What really makes things go today is the large amount of data gathered from a large number of sensors. Earlier, you would capture only the data you knew you might need. Today the mantra seems to be "how do you know what you will need later? grab all the data you can hold" -- and this leads to a massive "database in the sky", a cloud if you will, where you can write once and access from anywhere. Cool, huh?
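As a small, hedged illustration of "write once, access from anywhere", here is a Python sketch against Amazon S3 via boto3 - one cloud object store among many, not one the post singles out. The bucket name is made up, and the snippet assumes AWS credentials are already configured.

```python
# Illustrative only: store a sensor reading in a cloud object store, then read it
# back from anywhere with credentials. The bucket name is hypothetical.
import json
import boto3

s3 = boto3.client("s3")
BUCKET = "my-sensor-archive"  # assumed to already exist

# Write once...
reading = {"sensor_id": "42", "temp_c": 21.7, "ts": "2015-11-16T09:00:00Z"}
s3.put_object(Bucket=BUCKET, Key="readings/42/2015-11-16.json",
              Body=json.dumps(reading).encode("utf-8"))

# ...access from anywhere.
obj = s3.get_object(Bucket=BUCKET, Key="readings/42/2015-11-16.json")
print(json.loads(obj["Body"].read()))
```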

"Micro-analytics" - a budding field - don't collect all the data in the world (in some cases, you are given the data set and don't have the luxury to get more data from historical sources), try using the data you have to extract the best intelligence you can from it. Admittedly in many cases the inferences you can draw will be limited by the data you have. In others though, the focus on the smaller data set might actually lead to a sharper focus on the variables captured and better results. In machine learning, as with all else, it comes down to what data you have or collect, what methods you know, and most importantly, how you use the information at your disposal. As they say, "it is always up to the operator".