This is the age of data. According to estimates made by IBM in 2017, around 2.5 quintillion bytes of data are created each day. That is 2.5 times a billion times a billion bytes of data per day.
The problem is not just storage, though. It is simply not possible to go through each and every unit of data in a finite amount of time because of, among other things, the sheer scale involved.
The real problem, then, is to make sense of the data: to employ intelligent techniques that decipher the patterns hidden behind these billions of bytes, to draw insights both concrete and abstract, qualitative and quantitative, and to visualize and predict from them.
Banks and financial institutions stake their existence on this. Marketers desperately seek it out. HR departments, social scientists, security mavens, manufacturing units, travel portals, astronomers, AI researchers, global navigation systems, insurance companies, public health departments – all jostle to get a peek into these patterns so they can understand and act meaningfully.
This is where analytics comes into play. As Wikipedia defines it, analytics is the discovery, interpretation, and communication of meaningful patterns in data.
We have long been in the analytics business.
Intelligence agencies, police forces, and defense organizations need to analyze data to understand, among other things, the locations of miscreants and terrorists. They also want systems at their disposal that can predict the areas, or the dates, where trouble is impending. We have built software systems that analyze call data records fetched from telecom companies. This data, coupled with historical data on events and criminals, allows effective plotting of networks of probable activities and events. RapidMiner, Neo4j, and other tools have been used extensively to put such analysis in the hands of intelligence analysts.
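To give a flavor of this kind of link analysis, here is a minimal sketch of how call data records can be turned into a contact network and mined for hubs. The record format and sample numbers are hypothetical, and this is a toy illustration rather than our production pipeline (which uses tools such as Neo4j for graph storage and querying):

```python
# Toy link analysis over hypothetical call data records (CDRs).
from collections import Counter

# Each CDR links a caller to a callee (identifiers are made up).
cdrs = [
    ("A", "B"), ("A", "C"), ("B", "C"),
    ("A", "B"), ("D", "A"), ("E", "D"),
]

# Count how often each pair of numbers is in contact,
# treating calls as undirected links.
edge_weights = Counter(tuple(sorted(pair)) for pair in cdrs)

# Degree: how many distinct contacts each number has.
degree = Counter()
for a, b in edge_weights:
    degree[a] += 1
    degree[b] += 1

# Numbers with the most distinct contacts are candidate hubs
# worth a closer look by an analyst.
hubs = degree.most_common(2)
print(hubs)
```

In a real deployment the same idea scales up: edges carry timestamps and weights, and the resulting graph is stored and queried in a graph database rather than in memory.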
Data centers, BPOs, and similar data processing units rely on large numbers of personal computers for their day-to-day transactions. Many of these computers unfortunately remain up and running beyond their scheduled work routine, which wastes a considerable amount of power. We have built systems that monitor idle computers on the network, analyze their power usage with an adaptive algorithm, and shut them down or bring them back up according to preset policies.
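The policy-driven part of such a system can be sketched in a few lines. The machine names, idle-time readings, and threshold below are hypothetical; the real system replaces the fixed threshold with an adaptive algorithm tuned per site:

```python
# Simplified policy check for idle machines (hypothetical data).
from datetime import timedelta

# Policy: shut down any machine idle longer than this threshold.
IDLE_THRESHOLD = timedelta(minutes=30)

# Machine name -> time since last user activity.
idle_times = {
    "pc-014": timedelta(minutes=5),
    "pc-037": timedelta(hours=2),
    "pc-101": timedelta(minutes=45),
}

def machines_to_shut_down(idle_times, threshold=IDLE_THRESHOLD):
    """Return machines whose idle time exceeds the policy threshold."""
    return sorted(name for name, idle in idle_times.items()
                  if idle > threshold)

print(machines_to_shut_down(idle_times))  # → ['pc-037', 'pc-101']
```

The interesting engineering lies in gathering the idle-time signal reliably across the network and in adapting the threshold to each unit's work patterns; the shutdown decision itself stays this simple.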
For sensors on oil rigs, sending hundreds of bytes of data per second per sensor is common. We have built software that captures this data, processes it, and displays it graphically in desktop and web-based applications. This enables geoscientists to closely monitor the excavation process and decide the path the drilling operations should take, ensuring proper utilization of billions of dollars of investment.
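A typical step in preparing such a per-second stream for display is smoothing it before it reaches the screen. The sketch below shows a simple moving average over a made-up sequence of readings; it is an illustration of the kind of processing involved, not the actual pipeline:

```python
# Smoothing a noisy per-second sensor stream with a moving average
# (readings are invented for illustration).
from collections import deque

def moving_average(stream, window=3):
    """Yield the mean of the last `window` readings for each new one."""
    buf = deque(maxlen=window)
    for reading in stream:
        buf.append(reading)
        yield sum(buf) / len(buf)

readings = [10.0, 12.0, 11.0, 50.0, 13.0, 12.0]  # 50.0 is a spike
smoothed = list(moving_average(readings))
print([round(x, 2) for x in smoothed])
```

Processing the stream as a generator like this keeps memory use constant no matter how long the sensor keeps transmitting, which matters when hundreds of sensors report every second.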
We have more than 200 man-years of experience in this domain.