Mainframes and the cloud – everything old is new again

Cloud computing, virtual machines: it’s big business. Amazon has its Elastic Compute Cloud (EC2), which provides “resizable compute capacity in the cloud”; Microsoft has Azure, providing “on-demand compute and storage to host, scale, and manage Web applications on the Internet”; and Google’s offering is App Engine, which offers “the ability to build and host web applications on Google’s infrastructure”. As you might know, I’m personally very taken with App Engine.

The offerings are slightly different – for example, while EC2 is bare virtual hardware, App Engine is a web application platform in the cloud. But they all have similar pricing arrangements, generally based on uptime or CPU time, I/O and storage.

Does this seem familiar to you? It does to me, but then again, I did just turn 0x2B this month. In 1988 I was working in the Database Support Group at a major energy company in London, looking after the SAP R/2 databases, which were powered by IMS DB/DC, on MVS – yes, IBM big iron mainframes. I still look back on those days with fond memories.

In reviewing some third-party software, I wrote a document entitled “BMC Software’s Image Copy Plus: An Evaluation”. BMC’s Image Copy Plus was a product that offered faster image copies of our IMS DB (VSAM) databases. (Image Copy Plus, as well as IMS, is still around, over 20 years on! But that has to be the subject of another post.)

One section of the evaluation compared costs as well as time: by how much would the backup costs be reduced using BMC’s offering?

And guess what the cost comparison was based on? Yes: CPU time, I/O (disk and tape EXCPs) and actual tapes.

(Image: mainframe job billing)

Everything old is new again.

SAP and Google Wave – Conversation Augmentation

It’s been pretty much six years to the day since I last wrote here about Dashboard, Nat Friedman’s project and implementation of a realtime contextual information system. So I thought it fitting to make a short demo showing integration between Google Wave and SAP, inspired by the cluepacket-driven style shown so nicely with Dashboard.

I got my Wave Sandbox account a week or so ago, and have had a bit of time to look at how robots and gadgets work – the two main Wave extension mechanisms. To get my feet wet, I built a robot, the subject of this weblog entry; it’s hosted in the cloud using Google App Engine, another area of interest to me. I used Python, but there’s a Java client library available too. You can get more info in the API Overview.

What this robot does is listen to conversations in a Wave, automatically recognising SAP entities and augmenting the conversation by inserting extra contextual information directly into the flow. In this example, the robot can recognise transport requests, and will insert the request’s description into the conversation, lending a bit more information to what’s being discussed.
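
In outline, the wiring looks something like the minimal sketch below, following the style of the Python client library’s tutorial; the robot name and URLs here are placeholders rather than the real robot’s details:

import re

from waveapi import events
from waveapi import robot

# The transport request pattern, discussed below.
TRKORR_PATTERN = re.compile(r' (SAPK\w{6}|[A-Z0-9]{3}K\d{6}) ')

def OnBlipSubmitted(properties, context):
    """Scan each submitted blip for SAP transport request identifiers."""
    blip = context.GetBlipById(properties['blipId'])
    text = blip.GetDocument().GetText()
    match = TRKORR_PATTERN.search(text)
    if match:
        trkorr = match.group(1)
        # Fetching the request's description and inserting it into
        # the conversation is sketched further down.

if __name__ == '__main__':
    transport_bot = robot.Robot('transport-bot',
        image_url='http://transport-bot.appspot.com/icon.png',
        version='1',
        profile_url='http://transport-bot.appspot.com/')
    transport_bot.RegisterHandler(events.BLIP_SUBMITTED, OnBlipSubmitted)
    transport_bot.Run()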

The robot recognises transport requests by looking for a pattern:

trkorr_match = re.search(r' (SAPK\w{6}|[A-Z0-9]{3}K\d{6}) ', text)

In other words, it’s looking for something starting with SAPK followed by six further word characters, or something starting with three alphanumeric characters, followed by a K and six digits (the more traditional customer-orientated request format). In either case, there must be a space before and a space after, to be more sure of it being a ‘word’.
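
A quick illustration of the pattern at work (the sample identifiers here are made up):

import re

TRKORR_PATTERN = re.compile(r' (SAPK\w{6}|[A-Z0-9]{3}K\d{6}) ')

# Made-up examples: one in each format, plus a near-miss with only
# five digits, which correctly fails to match.
for text in (' SAPKA70042 ', ' NSPK900115 ', ' NSPK90011 '):
    match = TRKORR_PATTERN.search(text)
    print text.strip(), '->', match.group(1) if match else '(no match)'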

How does it retrieve the description for a recognised transport request? Via a simple REST-orientated interface, of course :-) I use the excellent Internet Communication Framework (ICF) to build and host HTTP handlers so I can expose SAP functionality and data as resources in a uniform and controlled way. Each piece of data worth talking about is a first class citizen on the web; that is, each piece of data is a resource, and has a URL.

So the robot simply fetches the default representation of the recognised request’s ‘description’ resource. If the request was NSPK900115, the description resource’s URL would be something like:

http://hostname:port/transport/request/NSPK900115/description

Once fetched, the description is inserted into the conversation flow.
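
Putting those two steps together, the fetch-and-insert might look something like this sketch, using App Engine’s urlfetch service; the base URL placeholder and the helper’s name are mine, not the original robot’s:

from google.appengine.api import urlfetch

# Placeholder for the real http://hostname:port of the ICF service.
BASE_URL = 'http://hostname:port'

def AnnotateWithDescription(blip, trkorr):
    """Fetch a transport request's description resource and append it
    to the blip, augmenting the conversation flow."""
    url = '%s/transport/request/%s/description' % (BASE_URL, trkorr)
    response = urlfetch.fetch(url)
    if response.status_code == 200:
        blip.GetDocument().AppendText(
            '\n[%s: %s]' % (trkorr, response.content))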

http://www.youtube.com/watch?v=G7W2M6H3OQo

(Originally written on SDN but republished here because of portal access issues)