After a protracted stay in the Americas, where coffee is king and tea is something people associate with kings and colonies, I recently transplanted myself to the UK and discovered that my recreational drinking habits have changed. When I lived in the Middle East, bacon and pork products were almost forgotten. Accessibility, culture and a number of other factors play into these behavior changes, and it struck me how similar this is to the topic of in-memory analytics.


We appear to have come full circle when it comes to HANA and in-memory analytics. Admittedly, the pitch has changed. Compared with the IBM System/34 that I worked on in the 1990s, we're less constrained by the hardware characteristics that dogged those kinds of systems. At the same time, though, we were more pragmatic about our data: we knew there were technical limitations associated with 8” floppy disks and 64 KB of RAM. Accordingly, data and records were purged periodically and our system maintained very lean data attributes. There was a lot more manual work, of course, and desktop PCs were quite commonplace, though the battle-scarred data processing folks often said they preferred the predictable and resilient experience of working with a terminal, unless the keys stuck, the CRT fused or the network gave out. The fundamental concept, though, remains largely the same: you use RAM to do your processing because storage devices like disks (floppy or hard) take too long to read and write the data.
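As a rough illustration of that last point (nothing HANA-specific, just the general principle), the short Python sketch below times the same aggregation twice: once over values already held in RAM and once re-read from a file on disk. The row count and scratch file name are arbitrary choices for the sketch.

```python
import os
import time

# Illustration only: compare summing values already held in RAM
# with re-reading and parsing the same values from a file on disk.
ROWS = 1_000_000        # arbitrary size for the sketch
PATH = "amounts.txt"    # hypothetical scratch file

values = [float(i % 1000) for i in range(ROWS)]
with open(PATH, "w") as f:
    f.write("\n".join(str(v) for v in values))

start = time.perf_counter()
total_in_memory = sum(values)                         # data already in RAM
in_memory_secs = time.perf_counter() - start

start = time.perf_counter()
with open(PATH) as f:
    total_from_disk = sum(float(line) for line in f)  # read back off storage
from_disk_secs = time.perf_counter() - start

os.remove(PATH)
print(f"in-memory sum:      {in_memory_secs:.4f}s")
print(f"read-from-disk sum: {from_disk_secs:.4f}s")
```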


Although some aspects of HANA are new, the concept of yet another piece of SAP infrastructure is probably a little disconcerting. After all, wasn't APO supposed to be the planning killer? Yet uptake of, and even dependence on, that SAP add-on seems to have been lacklustre or, at best, understated. Lora Cecere commented in a blog post on how she felt HANA could help SAP “redefine enterprise applications”. Interestingly, one of the fundamental characteristics of APO is a thing called ‘LiveCache’; it seems, simplistically at least, that LiveCache has a lot of parallels with HANA. In one SAP presentation I saw, APO was in fact described as leveraging early in-memory technology through LiveCache and MaxDB. In fact, Mark Chapman, Principal BI Consultant at Bluefin Solutions, points out that HANA isn't really new technology at all.


Some interesting use cases have emerged in early implementations of HANA. The problems HANA is tackling are undoubtedly real and probably reflect initiatives that were previously embarked on with traditional technologies but faltered because of how long it took to get meaningful data back. Even when something meaningful did come back, it was often the trigger for yet another report or query with slightly different characteristics, which again would take a long time to produce something useful. We can conclude, then, that HANA has an inevitable and very relevant applicability to most large organizations, or to organizations with big data.


It is probably equally interesting that, historically, we didn't have great ways to get into our SAP systems all the kinds of data they supported, unless we had deep pockets and a lot of hands-on-keyboard manpower to apply to the problem. Another option was building complex interfaces but, with so many other fish to fry in a large SAP implementation, data initiatives that don't obviously add value or generate growth or revenue don't get a high priority.


At Winshuttle, for example, we're seeing increased demand for more comprehensive information and real-time analytics in SAP systems from a slightly different angle. The demand is evident in the growing number of obscure, data-supporting transactions that customers want to record with Winshuttle Transaction. What this potentially means for infrastructure products like HANA is that the business's expectations for slicing and dicing more data are likely to increase exponentially. Those queries that the back office were told they could only run at off-peak times or on weekends will now likely be run at any time. Consider, too, that some queries are never even attempted against productive OLTP systems because there is a very real fear that system performance will degrade (particularly with badly written queries) and lead to a bad experience for other users in dialog sessions. Instead, the business was told to rely on BI/BW for those reports: reports that were often stale by the time they had run or, worse, lacked data attributes that had never been defined in the cube, and as a consequence were of little use for near-real-time tactical planning.
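To make “slice and dice” a little more concrete, here is a minimal, hypothetical Python sketch of the column-oriented, in-memory layout that HANA and similar engines use: each attribute is kept as its own array, so an ad hoc aggregation scans only the columns it needs rather than whole rows. The table, column names and figures are invented for illustration.

```python
from collections import defaultdict

# Toy in-memory column store: each attribute lives in its own list,
# so an aggregation touches only the columns it actually needs.
# The sample rows below are invented for illustration.
orders = {
    "region":  ["EMEA", "AMER", "EMEA", "APJ",  "AMER"],
    "product": ["A",    "B",    "A",    "C",    "A"],
    "amount":  [120.0,  75.5,   310.0,  42.0,   88.25],
}

def slice_and_dice(columns, group_by, measure):
    """Sum `measure` grouped by `group_by`, scanning just two columns."""
    totals = defaultdict(float)
    for key, value in zip(columns[group_by], columns[measure]):
        totals[key] += value
    return dict(totals)

print(slice_and_dice(orders, "region", "amount"))
# {'EMEA': 430.0, 'AMER': 163.75, 'APJ': 42.0}
```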


HANA may allay some of these fears, particularly if the scenarios implemented are aligned with the system architecture designs in play, such as a real-time sync with the OLTP system. It has been suggested that this design will not only help with planning data and analytics but also speed up online transaction processing. What I can also conclude is that rather than just resolving an existing problem with planning and analytics, HANA will offer the business new avenues to expand the footprint of SAP. Instead of relying on a combination of technologies (Excel included) for planning, reporting, record keeping and decision making, the business will start pouring that much more data into the SAP landscape.


All this will become the proverbial double-edged sword for IT and the business. High-performance database technology will allow us to arrive at conclusions faster; the business will demand changes or corrections with greater frequency and in an accelerated way, and IT will be expected to respond. Part of the arsenal for achieving this will likely include retaining the legacy infrastructure currently in play, but it will certainly be exciting times for SAP partners, in-house IT and system integration partners alike. So, in conclusion, I guess what I am expecting is some behavior change in business and IT. Just as tastes differ across the globe, one adapts one's preferences to what is available and the context in which one finds oneself. Our ability to do more with HANA may mean that our Big Data, as Winshuttle partner Optimal Solutions says, will simply get… bigger.

Categories: TechNews