Tuesday, February 23, 2016

Big Data/Analytics: A Fertile Breeding Ground For Start-Ups?


Gartner's 2016 Magic Quadrant for Advanced Analytics Platforms just touches the tip of the iceberg as far as the number of players in this burgeoning sector is concerned.
The start-up ecosystem is also pretty crowded in this space, with many players emerging in different niches based on various horizontal and vertical technology and customer-differentiation strategies. Nowhere is the action more visible than in the IT hub of Bengaluru, India.
I recently had an opportunity to meet with three of these.  
Tetrus Corporation, headed by Sharad Rao, focuses on data mining and analytics from a US national security perspective and so works closely with the Public Safety, Homeland Security, Justice and Corrections communities. Sharad describes their key strength as the ability to work across traditional data silos, giving security officials a holistic overview of the data and the ability to recognize threats and opportunities which might otherwise have stayed hidden in the seams. The company now strives to leverage those same strengths and capabilities in offerings targeted at the private sector as well as other geographies.
Axtria has focused on the people and expertise dimension, building a strong bench of data science and analytics talent by hiring some of the best graduates and PhDs from the most prestigious institutions in India and elsewhere. "Axtria is a data sciences company that operates at the intersection of deep analytics, domain expertise and technology," is how Manish Mittal, Managing Principal, described the firm.
Sigmoid, founded by a bunch of bright sparks from IIT Kharagpur including Lokesh Anand and Mayur Rustagi, has gained a lot of traction in the retailing space with SigView, an integrated solution built for scale and speed on Apache Spark. The use of an in-memory columnar database with some unique indexing algorithms gives them the ability to ingest large volumes of data and deliver quick analytics. Mayur and Lokesh were gung-ho in describing future plans to expand their reach across more industry sectors in the US as well as globally. Also in their plans: launching SigAI, a self-learning intelligence system on Apache Spark and Titan Graph.
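SigView's engine itself is proprietary, but the core idea behind in-memory columnar storage with indexing is easy to illustrate: store each field as its own contiguous array, index the values you filter on, and an aggregation only touches the columns it needs. A minimal sketch (not Sigmoid's actual design; the toy data is hypothetical):

```python
from array import array

# Row-oriented records as they might arrive from a retail feed.
rows = [
    {"store": "A", "sku": 101, "revenue": 12.5},
    {"store": "B", "sku": 101, "revenue": 7.0},
    {"store": "A", "sku": 102, "revenue": 3.25},
]

# Pivot into columns: each field becomes one contiguous array.
columns = {
    "store":   [r["store"] for r in rows],
    "sku":     array("q", (r["sku"] for r in rows)),
    "revenue": array("d", (r["revenue"] for r in rows)),
}

# A simple inverted index on "store" stands in for the indexing
# layer: it maps each value to the row positions that hold it.
index = {}
for pos, value in enumerate(columns["store"]):
    index.setdefault(value, []).append(pos)

# Aggregate revenue for store "A" while touching only two columns.
total = sum(columns["revenue"][pos] for pos in index["A"])
print(total)  # 15.75
```

The payoff is that a query over billions of rows scans only the handful of column arrays it references, which is what makes fast interactive analytics over large ingested volumes feasible.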
I was impressed by the capabilities on display and am sure we will be hearing more about these companies in the days to come.
Once again though, this is just the tip of the iceberg...
Also published on LinkedIn

Monday, February 8, 2016

3 Biggest Challenges in the "Internet of Things" Arena

Gartner VP & Fellow David Cearley, technology visionary and futurist, joined the conversation on one of my earlier posts about the Internet of Things: Digital Mesh or Digital Goo? What will bind the IoT ecosystem together?
Call it the Internet of Things (IoT) or the Internet of Everything (IoE), players in this burgeoning field are facing a common set of challenges, as I gathered from a conversation with a leading vendor venturing into this space:
  • "Where are the Guinea Pigs?" (Early Adopters): The biggest challenge is finding people willing to pay for or invest in this area, since most cannot yet visualize the benefits of the analytics that can be generated or the operational efficiencies that can be realized using IoT/IoE. Vendors face an uphill climb explaining how they can improve operational efficiency until they find a stakeholder forward-looking enough to make a small investment and give the vendor access to at least a subset of the environment to run a pilot and prove the use case. Vendors with proven success stories have a head start.
  • "No Cookie-Cutters": Every environment is unique, so a cookie-cutter, one-size-fits-all approach cannot work. While a vendor can offer their platform as a starting point, they still need to work with other partners to deliver an integrated solution. Vendors find themselves challenged to deliver services tailored to each customer's unique environment, which is a big opportunity for strategic and implementation partners.
  • "Missing Sensors": The biggest technical challenge is not security, as one would expect, but the fact that many devices lack the capability to capture the data and metrics needed to generate the required analytics. Vendors need to retrofit legacy devices with sensors to pull back the requisite data, especially in the manufacturing and buildings industry sectors.
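Retrofitting looks different for every device, but the pattern behind it is simple: a bolt-on sensor samples a machine that exposes no telemetry of its own and emits timestamped readings for downstream analytics. A minimal Python sketch, with every name in it (`RetrofitSensor`, `boiler-7`) purely hypothetical:

```python
import json
import time

class RetrofitSensor:
    """Bolt-on adapter for a legacy machine with no native telemetry."""

    def __init__(self, device_id, read_fn):
        self.device_id = device_id
        self.read_fn = read_fn  # callable returning one raw measurement

    def sample(self):
        """Capture one reading as a timestamped JSON payload."""
        return json.dumps({
            "device": self.device_id,
            "ts": int(time.time()),
            "value": self.read_fn(),
        })

# Simulated legacy boiler whose temperature can only be measured
# via an externally attached probe (a fixed value stands in here).
sensor = RetrofitSensor("boiler-7", read_fn=lambda: 86.4)
payload = sensor.sample()
print(payload)
```

In a real deployment the `read_fn` would wrap the probe's driver and the payload would be published to a message broker at a polling cadence, but the adapter shape stays the same.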
Are you working in the IoT/IoE space? What are some of the challenges you are facing? Please join the conversation.
Originally published on LinkedIn. You can join the conversation there.

Tuesday, December 22, 2015

PReDICTT: Engaging All Employees In The Innovation Mindset!


Yes, that's right. It's no typo. I mean PReDICTT. It's the acronym for a unique program we initiated this year: Peers Reflecting on Developments in Current Technology Trends.
The idea behind it is simple. In our roles as technology individuals, we are constantly exposed to technology and innovation trends: some directly related to our work endeavors and some seemingly unrelated, and these make us wonder what their influence on us may be in the days to come. Some examples include Bitcoins, self-driving cars, the Internet of Things, millennials in the workplace, etc. We often wish for a forum where we could learn more about these trends and share our own ideas and thoughts with others in an open, candid, non-judgmental fashion.
The PReDICTT Program (Peers Reflecting on Developments In Current Technology Trends) gave the team an opportunity to periodically engage in creative brainstorming and sharing of ideas distinct from our day-to-day responsibilities: the opportunity to view the world through a telescope rather than the rearview mirror or windscreen for a change. A key premise is that innovative thinking, even when unrelated to one's normal line of work, makes individuals more effective in whatever they do.
The program comprised "Open Mike" sessions at a regular cadence where volunteer and guest speakers shared their thoughts on a technology or innovation trend and its likely implications. The requirement for speakers was not that they be experts in the topic but that they feel passionately enough about it to be willing to share their ideas with their peers.
Through this program, even as we brought "Innovation" thinking to the fore at a regular cadence, driving positive downstream impacts, we also strove to:
  • Enhance communications to employees and partners
  • Enhance motivation and presentation skills for the speakers
  • Enhance the image of our Information Management group with partners through forward-looking topics
  • Give individuals an opportunity to step outside their primary job responsibilities and be recognized for it
We had 8 very interesting sessions during the year spanning the gamut of innovations:
  • Security in the era of BYOD 
  • Steganography and Cryptography 
  • The Changing Forms of Money and What it Means for Us- Bitcoins et al 
  • Ubiquitous Internet: Balloons, Drones 
  • Jugaad- Frugal Innovation
  • Wearables
  • Autonomous Vehicles, Driverless Cars
  • Future of Computing
As intended, all were presented by volunteers. What was even more stimulating was the excellent exchange of ideas following each presentation, which often spilled over into our vibrant Yammer group: lots of thoughts on how these trends could impact what we do on a daily basis and our marketplace.
The PReDICTT keywords reiterated at every session are:
  • Share & Participate
  • Develop & Grow
  • Ideas & Innovation
  • Passion & Fun
2016 promises to be even more exciting for PReDICTT as more groups warm up to the idea and join; and even more Innovations keep appearing in the world we live in! 
So one of the trends I predict for 2016 is more companies launching initiatives like this to engage their employees in "Innovation Thinking", as I call it.
What about your company? What do you intend to do in 2016 to keep your employees intellectually stimulated and engaged, even if their day-to-day jobs are relatively routine and mundane?

Wednesday, December 9, 2015

Google in the Time of Floods!


Many of you may not be aware of the incessant, once-in-a-hundred-years rains and the consequent flooding that recently plagued the southern Indian coastal city of Chennai.
Then again, many of you may well be aware, since given the globalized nature of the world economy, you or one of your vendors most likely has operations in Chennai (especially if you are in the BPO, tech or automotive sectors).
One thing that struck me as I sat in the US interacting with friends marooned in the city was the big role played by social media - Facebook, WhatsApp and the like - in enabling people to stay connected and in ensuring that help and resources were directed to the neediest. On the flip side, social media, though helpful, was also rife with rumors and misinformation. Internet connectivity was also a problem, as a sizable portion of the population there still relies on "wired" internet - the cable breaks and power outages did not help.
A picture can say it better than a thousand words! A missing piece of critical information for both the stranded people and the first responders is real-time imagery of flood-affected areas, to comprehend which streets are open, which areas are submerged and how the floodwaters are moving. Google Maps could play a big role in addressing that gap. Currently, I believe, the satellite imagery and street views in Google Maps are refreshed only after long periods of time. In crisis situations like these, the satellite imagery could be updated more frequently, perhaps even in near real time, so that users have the latest information. With correct, updated maps available, I am sure the user community would crowd-source to tag them with relevant highlights: roads and bridges to avoid, drinking water stocking points, food stocks, etc.
Of course, Google would need to allocate some resources to ensuring that their satellites - "birds" (or those of their imagery providers) - get positioned appropriately. Crises like these could be another opportunity to leverage Google's Loon Project: those stratospheric balloons, instead of just enabling internet connectivity, could also provide real-time imagery.
Rather than this being just charity, it would make sound business sense too:
  • Make Google's service more "sticky". People are not likely to forget the tools/technologies/companies which helped them in time of crisis.
  • People will continue to visit these sites, use these apps even during times of crisis when visits to other kinds of sites are likely to fall. So Google's quantum of "eyeballs" or "clicks" will not fall.
  • Insurance companies and the like would be willing to pay for access to more near real-time information.
Larry Page, Sergey Brin, Sundar Pichai: Can you hear me?

Pay-as-you-go IT: CFO’s Dream, CIO’s nightmare?


Co-Authors:   
Introduction
As the nature of business becomes more "digital", a pay-as-you-go, consumption-based IT cost model is likely to emerge as a major disrupter to the traditional view of IT as a capital expenditure. The authors investigate some of the drivers for this shift, how it is likely to play out in the boardroom, and the approaches which can be used to make this disruption a positive one for the corporation.
Synchronicity of IT Spend with Corporate Revenue
As we look at the IT-business interaction from a strategic perspective, especially as it relates to the cost and funding dimension (manifested in the annual and long-term budgeting exercises and periodic cost-reduction initiatives), we see one of the basic issues as "synchronicity": how can IT spend be more closely aligned with the ebbs and flows of the business revenue stream?
In many large corporations this issue also impacts the long-term IT vision of transitioning to a simplified, streamlined end-state architecture (vis-à-vis the convoluted mish-mash of legacy applications which generally exists), as the business becomes increasingly wary of making the large investments required to drive the necessary changes while its own revenue projections fluctuate. In fact, in some cases the large IT investments seem to pay off initially, but as time progresses the increase in IT cost becomes far out of proportion to the increase in revenue.
Pay-as-you-go (Usage Model)
From a CFO perspective, then, a pay-as-you-go model seems very attractive. In the most simplistic configuration all applications will be hosted in the cloud with business users being charged based on the actual consumption of resources.
Rather than implementing any major platform initiative in-house with its associated fixed costs (Capex), technology vendors will be required to provide the desired capabilities as a consumption-based service (Opex). Technology products, if built in-house, can be built using pay-as-you-go resources from a cloud vendor. These capabilities can also be leveraged to evaluate the feasibility of the product from a business perspective and to pilot risky projects without significant budget commitment. This shifts the initial fixed investment outlay into smaller monthly outlays based on usage.
Another key premise is that since the vendor will be leveraging economies of scale over a larger user base, they will be able to drive costs down significantly more than the company could achieve by hosting applications in-house.
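The Capex-versus-Opex trade-off above is ultimately a break-even calculation: a large up-front outlay plus modest running costs, against a usage charge that scales with consumption. A back-of-the-envelope sketch in Python, where every figure is hypothetical and chosen only for illustration:

```python
# All amounts are hypothetical, for illustration only.
capex_upfront = 500_000.0      # one-time in-house build cost
inhouse_monthly = 10_000.0     # recurring in-house run cost
cloud_rate_per_user = 5.0      # pay-as-you-go monthly rate per user
users = 3_000

def cumulative_cost(months):
    """Total spend under each model after a given number of months."""
    inhouse = capex_upfront + inhouse_monthly * months
    cloud = cloud_rate_per_user * users * months
    return inhouse, cloud

# Find the month (if any) where pay-as-you-go overtakes in-house.
break_even = None
for m in range(1, 241):
    inhouse, cloud = cumulative_cost(m)
    if cloud > inhouse:
        break_even = m
        break

print(f"Pay-as-you-go exceeds in-house total cost at month {break_even}")
```

The point of the exercise is that the answer depends entirely on usage volume and vendor rates: at high sustained usage the fixed investment can still win, which is exactly why the economies-of-scale premise matters to the Opex case.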
IT Supply Chain – Digital Products
The nature of the products and services offered by corporations are also dramatically changing with the “digital” component becoming a key part of the offering and value proposition for the customer. For example a traditional product may now be bundled with a digital service enabling customers to access and store content. IT processes and infrastructure will play a role in securing the digital files, storing them and enabling access to these files.
As a result, the IT infrastructure component of the offering's cost needs a more direct alignment to the total cost rather than just being an allocation of a fixed charge. In essence, IT infrastructure is becoming the digital supply chain for the offering, and as with any conventional product, supply chain costs have to be in alignment with the product's revenue.
The CEO’s Viewpoint
Couple this with the growing CEO perspective that most IT spend is "non-differentiating for the business." CEOs also realize that as their business becomes more "digital", creating and securing their digital assets is critical to maintaining the corporate reputation. As IT spend grows, they want to demand more from it, along with greater accountability and transparency around the burgeoning expenses. A granular pay-as-you-go cost model creates more transparency into what the "non-differentiating" portions of IT cost on a per-transaction or per-user basis.
The CIO’s Dilemmas
Not many CIOs have jumped on this bandwagon though. Some of their wariness stems from the security implications of moving everything off-premises, and some from viewing the historical investments in in-house IT infrastructure and data centers as a sunk cost. Economists would argue that the sunk-cost dilemma is moot: per the "bygones principle," only the "extra" or "marginal" costs and benefits of a decision need to be evaluated. Ideally, past costs should be ignored and only future costs and benefits considered: a hard-headed calculation of the extra costs one will incur, weighed against the extra advantages. But this is easier said than done. Also, what often gets ignored is that while the one-time cost of implementing an application or building an infrastructure may have been "sunk," there are associated recurring costs - licensing, electricity, manpower, etc. - which are very much real and need to be factored in.
CIOs are also not yet fully sold on the idea of a pay-as-you-go or consumption-based IT cost model being as "variable" as it is touted to be. The likelihood of vendors introducing some "fixedness" (start-up charges, launching charges, termination fees, etc.) to the mix looms large. In some cases vendors have begun allaying these fears by agreeing to abstain from any "fixed" components in their cloud offerings. CIOs also fear that their organization may not have the clout to motivate vendors to move to such an Opex model, but that is likely to change as an increasingly large number of companies demand something similar.
CIOs have to worry about their in-house legacy applications too: the ones that cannot be hosted outside for a myriad of reasons. These can, though, be moved to private clouds, changing the cost model from a fixed charge allocated to the business to consumption-based usage charges. And then, what about the vendors who cannot offer their capabilities as a cloud-hosted subscription or usage-based solution? Multiple vendors collaborating to resolve this is an option: one provides the application and the other the cloud access.
Driving the Costs Down
This brings us to another very important point. While organizations may not be ready to move to a pay-as-you-go model yet, there is still a considerable opportunity to leverage the model to drive IT costs down. In the absence of a consumption-based "tariff", a "Tragedy of the Commons" kind of scenario plays out, where a shared common resource is inefficiently or sub-optimally utilized, somewhat like the scenario where free electricity or water leads to more waste and less incentive to conserve. Currently, IT cost-reduction initiatives are generally periodic percentage cost-reduction targets emanating from Corporate Finance or business leadership, and IT responds by chipping away at cost elements as best it can.
Transactional Cost Modeling for a Pay-as-you-go Approach
If organizations choose to move to the cloud or embrace a public cloud vendor or have everything on an in-house cloud, it is essential that they embed transactional cost modeling into the design of the IT solution or product. In this approach the IT architecture is used as a basis to model all the cost elements associated with the application. Even legacy applications can be modeled as if the applications were hosted on a private cloud and then can be benchmarked against the rates quoted by cloud providers like Amazon Web Services or Microsoft Azure for the same capabilities. Business users can be exposed to this cost comparison as an element of a “Go Out vs. Stay In” decision making.
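The "Go Out vs. Stay In" comparison described above amounts to pricing each cost element of an application under two rate cards. A sketch of the mechanics in Python; the usage figures and all rates are hypothetical, not quoted AWS or Azure prices:

```python
# Modeled monthly consumption for one application (hypothetical).
usage = {
    "vcpu_hours": 20_000,
    "storage_gb_months": 5_000,
    "egress_gb": 800,
}

# Rate cards: fully-loaded in-house unit costs vs. a cloud quote.
# All rates are illustrative placeholders.
inhouse_rates = {"vcpu_hours": 0.09, "storage_gb_months": 0.05, "egress_gb": 0.00}
cloud_rates   = {"vcpu_hours": 0.06, "storage_gb_months": 0.03, "egress_gb": 0.08}

def monthly_cost(rates):
    """Price the modeled usage under a given rate card."""
    return sum(usage[item] * rates[item] for item in usage)

inhouse = monthly_cost(inhouse_rates)
cloud = monthly_cost(cloud_rates)
print(f"in-house ${inhouse:,.0f}/mo vs cloud ${cloud:,.0f}/mo")
```

Note how the cloud rate card surfaces a cost element (data egress) that the in-house allocation hides entirely, which is precisely the transparency the transactional modeling exercise is meant to create.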
The rigors of such a cost-modeling exercise will also expose the business to discrete elements of IT cost which have hitherto remained hidden in larger silos. For example, the cost associated with an IT solution or application is not just the cost of the production infrastructure; the development and test infrastructure also needs to be counted towards the operational cost of the application. If the company doesn't have an offshore team, or works only limited shifts, the development instances can be operational only during actual working hours, reducing the number of hours in a year from 8,760 to 2,008: a reduction of over 75%.
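The arithmetic behind the 2,008-hour figure can be checked directly. It corresponds to one 8-hour working day across roughly 251 working days a year, which is the assumption that reproduces the article's numbers:

```python
HOURS_PER_YEAR = 24 * 365   # 8760 hours if instances run continuously
working_days = 251          # assumed working days per year
shift_hours = 8             # assumed hours of active development per day

dev_hours = working_days * shift_hours          # 2008
reduction = 1 - dev_hours / HOURS_PER_YEAR      # fraction of hours saved

print(f"{dev_hours} hours/year, a {reduction:.0%} reduction")
```

Every saved hour translates into consumption charges not incurred, which is why the hidden dev/test footprint is worth modeling at all.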
Even production applications do not necessarily have to operate around the clock: by building a flexible downtime capability into the application, one can schedule downtimes outside the window of operations to minimize costs.
Conclusion
Failure to recognize the pay-as-you-go IT model as a platform disrupter can be very detrimental for organizations. The successful ones will leverage the trend to drive down costs and will reap the benefits. Adoption of the model does not need to be a revolutionary change as organizations can follow a more evolutionary approach by first focusing on it as a mechanism to get a better understanding of their own IT costs and endeavor to make them more “variable”, truly reflecting the demand patterns.
