Archive for the 'evangelism' Category

Open Source redux

While I don’t update here much anymore, that’s mostly because I haven’t been active in the general technology scene for the last 2.5 years, following my departure from Dell and the resulting non-compete. I’m taking a few easy steps back now: I’ve reactivated my British Computer Society (BCS) Fellow membership and am hoping to participate in their Open Source Specialist Group meeting and AGM on October 25th.

MS-DOS Open Source

Interestingly, Microsoft have announced they are re-open-sourcing the code for the MS-DOS 1.25 and 2.0 releases. Although never available in its entirety outside of Microsoft or IBM, there were certainly sections of the code floating around in the mid-1980s. I was given the code for some drivers in 1984 by an IBM Systems Engineer, which I proceeded to hack and use as a starter for the 3270 driver I used for file transfer.

I’ve got a copy of the code released by Microsoft, and over the next six months I’m going to set about compiling it and getting it to run on a PC, as a way to re-introduce myself to PC Assembler and the current state of compilers.

Remembering the dawn of the open source movement

and this isn’t it.


Me re-booting an IBM System 360/40 in 1975

When I first started in IT in 1974, or data processing as it was called back then, open source was the only thing. People were already depending on it, and defending their right to access source code.

I’m delighted with the number and breadth of formal organizations that have grown up around “open source”. They are a great thing. Strength comes in numbers, as does recognition and bargaining power. Congratulations to the Open Source Initiative and everything they’ve achieved in their 20 years.

I understand the difference between closed source, (restrictive) licensed source code, free source, open source etc. The point here isn’t to argue one over the other, but to merely illustrate the lineage that has led to where we are today.

Perhaps one of the more significant steps in the modern open source movement was the creation in 2000 of the Open Source Development Labs (OSDL), which in 2007 merged with the Free Standards Group (FSG) to become the Linux Foundation. But of course source code didn’t start there.

Some people feel the source code fissure opened when Linus Torvalds released his Linux operating system as open source in 1991, while Linus and many others think the work by Richard Stallman on the GNU toolset and GNU License, started in 1983, was the first step. Stallman’s determined advocacy for source code rights and source access was certainly a big contributor to where open source is today.

But it started way before Stallman. Open source can trace its roots to two of the industry’s behemoths, IBM and AT&T, and the original advocacy came from them too. Back in the early 1960s, open source was the only thing. There wasn’t a software industry per se until the US Government invoked its antitrust laws against IBM and AT&T, eventually forcing them, among other things, to unbundle their software and make it separately available.

’69 is the beginning, not the end

The U.S. vs. I.B.M. antitrust case started in 1969, with the trial commencing in 1975(1). The case was specifically about IBM blocking competing hardware makers from getting access, and preventing customers from running competing systems, primarily of S/360 architecture, using IBM software.

In the years leading up to 1969, customers had become increasingly frustrated and angry at IBM’s policy of tying its software to its hardware. Since all the software at that time was available in source form, what that really meant was that a business HAD to have one IBM computer to get the source code; it could then purchase a plug-compatible manufacturer’s (PCM) computer(2), compile the source code with the manufacturer’s assembler and tools, and run the binaries on the PCM systems.

IBM made this increasingly harder as the PCM systems became more competitive. Often large, previously IBM-only users, who would have two, four, sometimes even six IBM S/360 systems costing tens of millions of dollars, would buy a single PCM computer. The IBM on-site systems engineers (SEs) could see the customers’ struggles and, along with the customers themselves, started to push back against the policy. The SEs’ job was made harder the more their hands were tied and the more restrictions were put on the source code.

To SHARE or not to?

For customers in the US, one of the major user groups, SHARE, had vast experience in source code distribution; its user-created content and tools tapes were legend. What most never knew is that back in 1959, together with General Motors, SHARE had its own operating system for the IBM 709 mainframe, the SHARE Operating System (SOS).

At that time there were formal support offerings: on-site SEs would work on problems and defects in SOS. But by 1962 IBM had introduced its own operating system for the 7090, which was incompatible with SOS, and at the same time IBM withdrew support from its SEs and Program Support Representatives (PSRs) for working on SOS.

1965 is, to the best of my knowledge, when the open source code movement, as we know it today, started

To my knowledge, that’s where the open source code movement, as we know it today, started. Stallman’s experience with a printer driver mirrors exactly what had happened some 20 years before: the removal of source code, and the inability to build working modifications to support a business initiative, using hardware and software ostensibly already owned by the customer.

IBM made it increasingly harder to get the source code, until the antitrust case. By that time, many of IBM’s customers had created, and depended on, small and large modifications to IBM source code.

Antitrust outcomes

By the mid-70s, as a result of years of litigation and consent decrees in the United States, IBM had been required to unbundle its software and make it available separately. Initially it was chargeable to customers who wanted to run it on PCM, non-IBM systems, but over time, as new releases and new function appeared, even customers with IBM systems saw a charge appear, especially as Field Developed Programs moved to full Program Products and so on. In a bid to stop competing products and user group offerings being developed from their products, IBM products were increasingly supplied object-code-only (OCO). This became a formal policy in 1983.

I’ve kept the press cutting from Computerworld (March 1985), shown above, since my days at Chemical Bank in New York. It pretty much sums up what was going on at the time: OCO, and users and user groups fighting back against IBM.

What this also did was give life to the formal software market; companies were now used to paying for their software, and we’ve never looked back. In the time since those days, software with source code available has continued to flourish. With each new twist and evolution of technology, open source thrives and finds its own place, sometimes in a dominant position, sometimes subservient, in the background.

The times of the late 1950s and ’60s were the dawn of open source. If users, programmers, researchers and scientists had not fought for their rights then, it is hard to know where the software industry would be now.

Footnotes

(1) The 1969 antitrust case was eventually abandoned in 1982.

(2) The PCM industry had itself come about as a result of a 1956 antitrust case and the consent decree that followed.

Open letter: CD Recycling

Dear IT Industry Colleague,

I’ve just moved house. In the process I realised that I had hundreds of old data CDs: some with old backups, many used to hold copies of other CDs, some DVDs with dumps of system folders, and so on and so forth.

I figured I’d just dump them in the recycling, which gets collected bi-weekly. On checking though, not only are these not recyclable, they are actually pretty hard to completely destroy. They also contain a large amount of toxic chemicals, and unless they are sent to a specialty recycling center, most end up in incinerators or landfill, neither of which is a good thing.

There is a good article here, from 2013, on the general problems with the creation and disposal of CDs and DVDs. It says, among other things:

The discs are made of layers of different mixed materials, including a combination of various mined metals and petroleum–derived plastics, lacquers and dyes, which, when disposed of, can pollute groundwater and bring on a myriad of health problems. Most jewel cases are made of polyvinyl chloride (PVC), which has been thought to produce a higher-than-normal cancer rate within workers and those who live in the area where it is manufactured. They also release harmful chemicals when incinerated.

Having realized the problems, what did I do? First, when disposing of old data CDs and DVDs you must understand there is an obvious potential security exposure. In principle, any data can be read from the CD. In practice, it may not be that simple if the data is formatted using specific backup programs, encrypted, etc. But you do have to consider this before discarding them.

I came up with a couple of easy ways to make recovering data hard. One involved scratching the recording sides (remember, some are dual-sided). The scratches can be removed, but it’s a time-consuming process and not something a casual finder of your CD would do.

The second process used a nail held in a set of grips; I heated the nail and simply pushed a couple of holes through each CD/DVD. Again, some data could still be read by the determined, but it’s very unlikely.

Once I was done marking all the media, I threw them in an old Amazon box, took them to the US Post Office, and mailed them as “media mail” to the CD Recycling Center of America, which provides “certified destruction” of your CDs.

Our industry uses vast amounts of natural resources; it consumes rare minerals at an alarming rate, often mined in difficult, dangerous, and sometimes illegal conditions. Individually, this is hard for us to do anything about. Please though, don’t throw old data CDs, DVDs or any other discs in the garbage/trash/refuse, and especially not in the recycling.

Yes, it takes a few minutes of your time; yes, it will cost you to box, tape, address and actually post the package back for destruction. Over the years IT has made me a lot of money; this is the least I can do. Please join me. Thank you.

 

(My) Influential Women in Tech

Taking some time out of work in the technical, software, computer industry has been really helpful in giving my brain time to sift through the required, the necessary, the nice, and the pointless things that I’ve been involved in over 41 years in technology.

Given that today is International Women’s Day 2016, and that numerous tweets have flown by celebrating women, many of them, given the people I follow, women in technology, I thought I’d take a minute to note some of the great women in tech I have had the opportunity to work with.

I was fortunate in that I spent much of my career at IBM. There is no doubt that IBM was a progressive employer on all fronts: women, minorities, the physically challenged; and that continues today with their unrelenting endorsement of the LGBT community. I never personally met or worked with current IBM CEO Ginni Rometty; she, like many I did have the opportunity to work with, started out in Systems Engineering and moved into management. Those I worked with included Barbara McDuffie, Leslie Wilkes, Linda Sanford and many others.

Among the most influential people in management at IBM was Anona Amis at IBM UK. Anona was my manager in 1989-1990, at a time when I was frustrated and lacking direction, having joined IBM two years earlier with high hopes of doing important things. Anona, in the period of a year, taught me both how to value my contributions and how to make more valuable contributions. She was one of what I grew to learn was the backbone of IBM: professional managers.

My four women of tech may, at some time or other, have been managers. That, though, wasn’t why I was inspired by them.

Susan Malika: I met Sue initially through the CICS product group, when we were first looking at ways to interface a web server to the CICS transaction monitor. Sue and the team already had a prototype connector implemented as a CGI. Over the coming years, I was influenced by Sue in a number of fields, especially data interchange and her work on XML. Sue is still active in tech.

Peggy Zagelow: I’d always been pretty dismissive of databases; apart from a brief period with SQL/DS, I’d always managed fine without one. Early on in the days of evangelizing Java, I was routed to the IBM Santa Teresa lab on an ad hoc query from Peggy about using Java as a procedures language for DB2. Her enthusiasm and dogma about the structured, relational database, as well as her ability to code eloquently in Assembler, were an inspiration. We later wrote a paper together, still available online [here]. Peggy is also still active in the tech sector at IBM.

Donna Dillenberger: Sometime in 1999, Donna and the then President of the IBM Academy of Technology, Ian Brackenbury, came to the IBM Bedfont office to discuss some ideas I had on making the Java Virtual Machine viable on large-scale mainframe servers. Donna translated a group of unconnected ideas and concepts I sketched out on a whiteboard into the “Scalable JVM”. That evolution of the JVM was a key stepping stone in IBM’s evolution of Java. I’m pleased to see Donna was appointed an IBM Fellow in 2015. The paper on the JVM is here(1).

Gerry Hackett: Finally, but most importantly, Geraldine, aka Gerry, Hackett. Gerry and I met when she was a first-line development manager in the IBM Virtual Machine development laboratory in Endicott, New York, sometime around 1985. While Gerry would normally fall into the category of management, she is most steadfastly still an amazing technologist. Some years later I had the [dubious] pleasure of “flipping slides” for her as she presented IBM strategy. Aside: today’s generation will never understand the tension between a speaker and a slide turner. Today, Gerry is a Vice President at Dell. She recruited me to work at Dell in 2009, and under her leadership the firmware and embedded management team have made steady progress and implemented some great ideas. Gerry has been a longtime advocate for women in technology, a career mentor, and a fantastic role model.

Importantly, what all these women demonstrated, by the “bucketload”, was quiet technological confidence: the ability to see, deliver and celebrate great ideas and great people. They were quiet, unlike their male peers: not in achievement, but in approach. This is why we need more women in technology: not because they are women, but because technical companies, and their products, will not be as good without them.

(1) Edited to link to the correct Dillenberger et al. paper.

Net neutrality and the FCC

If you have not seen John Oliver lay bare the FCC Net Neutrality proposal, you must. You can find it here.

I’ve shared it widely among my Facebook friends and implored them to email or contact the FCC. In the last few days I’ve been asked what I wrote. Here is my letter in full. I’ve taken the liberty of marking up a few minor corrections I wish I’d made before I sent it.

To: openinternet@fcc.gov

Sir, Madam,
I am a Legal Foreign Resident living and working in Austin, Texas; I have lived here for 8 years, and for 2 years before that in New York State. I am an Executive Director and Senior Distinguished Engineer at Dell Computer; I write, though, as a private individual.
The current service I receive, in a residential street just 1 mile from Austin City Hall, is expensive, slow, and the only option available to me. Yes, since the Google Fiber announcement, both Grande Communications and AT&T have announced new offerings, but neither is available. [on my street]
Net net, I have no choice. Now, you are proposing that the relatively expensive cable service I and my neighbors receive will now be subject to a further direct or indirect charge.
It’s not clear at all that I can give away any more of my privacy to the massive cable conglomerates, so the only way your proposal would work is either I pay the cable company more money for premium traffic, or I pay the content companies more for premium content. The end result for me, though, is that I will be paying more for the same service I had last year.
This proposed dual-speed internet traffic regulation must NOT be implemented. It is anti-competitive, allowing larger companies to outspend smaller, newer companies; it is monopolistic, allowing the incumbent cable companies to shore up their already entrenched positions by effectively charging more for the same. Ultimately, this proposal further allows monopolistic providers to shake down individuals and content companies for no real benefit.
US Cable Broadband is shockingly expensive. I regularly travel the world on business and I’m sure you have the data that shows that the USA, in general, is both more expensive per Mb per second, and has slower transport speeds to home users in cities like Austin, than do cities in what a few years ago we would have considered third world.
The price of an Internet connection to homes, at a speed that makes modern home working, web conferencing etc. effective, is now becoming an inhibitor. Soon it will have an effect on hiring patterns and wages for entry-level positions. This cannot be what you want. Are video conferencing companies and web cam/whiteboard services premium content?
Please ensure that the Net Neutrality proposal is NOT implemented and revisit cable Internet as a common carrier.
—————————-

 

Linkedin Pulse – More walls and barriers

Dell has been encouraging team members to post to the new LinkedIn Pulse, which came to LinkedIn as part of a $90m acquisition last year. I’ve been looking at it from my phone and tablet while travelling, and wrote this summary today; I thought it worth sharing here with many of my tech-savvy friends, colleagues, and readers.

“I won’t be participating. I assume you never got answers to my questions, mostly because, as expected, LinkedIn are trying to recreate the best of the Web inside a walled garden.

That is, they appear to have introduced a new comment and groups system, and are encouraging posts which are really better suited to public blogs, news websites, or company websites.

I can’t work out the link semantics, making it difficult to share anything outside LinkedIn. There seems to be little or no moderation on comments and discussions. The article on CISO and Target is a good example; follow on to the comments.

I suspect they won’t make full articles available to search engines, and you will have to be logged in to LinkedIn to read them.

Since I’m not invested in LinkedIn, and not in classical marketing, I can’t see any value in helping LinkedIn recreate the Internet inside LinkedIn.”

Did I get it wrong? LinkedIn is trying to be the Facebook-like place where recruiters and marketing people hang out.

 

BUILD Madison

Last week I had the opportunity to go to the Dell Software Group lab in Madison, Wisconsin. I’d been there briefly once before and was impressed with the energy and community involvement; this trip was for their 3rd annual build-a-thon, managed and excellently compèred by location manager Tom Willis. I was joined by Doug Wright, who manages our common engineering team, which includes some team members in Madison.

The event was a 24-hour hack-a-thon, where the R&D staff submitted ideas in advance, formed teams, and went at the problems in a fun, relaxed, team environment. Projects didn’t have to be specifically work-based, and one team took the opportunity to build a telepresence robot using LEGO(r) and an Android tablet.

It was really refreshing to see programmers going at challenges in only 24 hours. Even in an environment where we do code releases for some products every 10 days, it reminded me of the CIP (continuous improvement programme) projects we used to run back in the mid-1980s, when we couldn’t get code out fast enough for the explosive PC application demand. It was idea, code, evaluate, clean up, ship, and then re-evaluate, code, “lather, rinse, repeat”.

The build projects were evaluated and awarded points based on their success criteria, except for the Director’s award, which was chosen directly. Extra points were given if a solution used the Common User Interface (CUI) UX framework and automated testing, or was grouped around key Dell initiatives.

There were two main winners, with lots of honorable mentions. The main winner was “Project Timber”, a distributed log aggregator. Although I’d already declared my hand over on the @dsgbuild twitter account, looking at the other projects, Doug and I decided on the CUI Builder for the Director’s award. Overall it was great to be around; congrats to Tom for organizing, and especially to Jenna for keeping the food and snacks flowing, and for the midnight root beer shakes. Finally, a special mention for the video confession booth: great idea, and the edited video was either really funny, or I was really tired…

For Dell Software Group employees, you can find more details and pictures here, on commons.


REPLY-TO-ALL storms

One thing that I’ve come to loathe is a “reply-to-all” email storm. They happen on all sorts of email systems, and are often made worse by making reply-to-all the default, rather than an option. They are also compounded by distribution lists, especially in big organizations. We had the perfect storm Friday afternoon: an errant email addressed to an all-org distribution list.

People’s inability to use the advanced features of the tools they spend so much of their time in, and their willingness to compound the problem with banal, stupid, inconsiderate and just thoughtless responses, also sent reply-to-all, not only astounds me, it frustrates me.

Yes, I’m sure you want to be removed from this email chain; yes, I know you want people to stop using reply-to-all, but sending that response reply-to-all just shows you’ve not thought it through and don’t know how to use your tools. I understand the problem is compounded by mobile use, where you don’t have the same technology, and having your BlackBerry vibrate in your pocket 200 times can be a little overkill.

If you have Microsoft Outlook, and especially Outlook 2010, then help is at hand. I keep a couple of Quick Steps for these occasions: the first is called REPLYTOALL and the second is called EDUCATE. REPLYTOALL just replies to the sender, pointing out how useless their reply-to-all was. The Quick Step changes the subject, adds the text, signs the email, and sends it; finally, it deletes the original message.

If they reply, and this is known since the subject is changed, then I use EDUCATE to reply. It sends the following reply, again changing the subject, adding the text, signing the email, sending it, and finally deleting their response.

You replied to an email chain by using reply to all… asking to be removed or similar… a completely pointless effort that just added to the problem.

Interestingly you simply don’t have to do that. You have three choices that don’t need a reply at all, let alone a reply to all.

  1. After you have read the first message, when you decide you don’t need to read any more, right click on that message in the inbox view and choose Ignore.
  2. Right click the message, select ALWAYS MOVE MESSAGES IN THIS CONVERSATION, and then select Deleted Items. Alternatively, right click and select Create Rule (this is more complex than the two options above but can achieve the same thing).
  3. What I set up some time ago is a Quick Step. It’s easy to do; it’s called REPLYTOALL. I select the messages and click the Quick Step, which is set up to send the reply you got, but also deletes the original email; I just took the chance to update it to change the Subject, as per this reply, to READ ME! And yes, just in case you wondered, this reply came from another Quick Step.
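The trick that makes the EDUCATE step work, recognising follow-ups because the Subject was deliberately changed, can be sketched outside Outlook too. A minimal Python illustration of that subject-matching idea (the subject strings are examples, and this is not Outlook's actual matching logic):

```python
import re

# Prefixes that mail clients stack onto the subject when replying or forwarding.
_PREFIX = re.compile(r"^\s*(re|fw|fwd)\s*:\s*", re.IGNORECASE)

def normalize_subject(subject: str) -> str:
    """Strip any stack of Re:/Fw:/Fwd: prefixes and surrounding whitespace."""
    while True:
        stripped = _PREFIX.sub("", subject, count=1)
        if stripped == subject:
            return subject.strip()
        subject = stripped

def is_reply_to(subject: str, marker: str) -> bool:
    """True if `subject` belongs to the thread whose subject was set to `marker`."""
    return normalize_subject(subject) == normalize_subject(marker)

# The EDUCATE Quick Step sets the subject to "READ ME!", so any
# "Re: READ ME!" that comes back is known to be a reply to it.
print(is_reply_to("Re: Re: READ ME!", "READ ME!"))          # True
print(is_reply_to("Re: All-Org Announcement", "READ ME!"))  # False
```

The same normalization is what makes option 2 above (move this conversation) possible for a mail client: threads are grouped by the subject once the reply prefixes are peeled off.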

Some of us actually think email is still a vehicle for communicating ideas, not just the quickest way to abdicate responsibility for doing anything.

 

When Patents Attack, Part Deux

I’m superpumped and have my headphones in, in my cube in the Dell “mau5hau5”. This weekend’s “This American Life” has returned to the topic of patents, especially software patents. Their June 2011 original episode was a classic exposé of the patent system, many of the seemingly ridiculous conundrums it involves, and why the patent system still embodies the classic American Wild West way of making money: it’s a halfway house between a shakedown and a gold rush.

Interestingly, patents have been at the front of my mind recently. A couple of weeks ago I attended the annual Dell patent awards dinner. Michael Dell was in attendance along with many of the Dell senior technical staff and executives, including my current and prior boss, legal, and the Dell Inventor of the Year. All great stuff. I have no patents. This year I’ve declined to be named on two patents. One, which uses NFC, was solely my idea: I pushed to get it considered, and I did the initial design, the software and app design. Yet two or three people who were lukewarm to the idea are being named on the patent. They have, though, been working on the actual design implementation.

This is good stuff. Morally and intellectually I’ve been against patents, especially software patents, since they came into being. Between 1979 and 1987 I learned my craft, and most of my skills, from reading IBM source code. During that time IBM, for various reasons, many misguided, some legal, slowly withdrew source code. These days few would ever consider being able to read the entire original source code for their products, while others, in the Linux community but increasingly in the web, database, and applications worlds, wouldn’t consider running or using a product that didn’t come with source code.

And so it was that, throughout my career at IBM, I declined numerous opportunities (approx. 15) to be named on a patent. It cost me financially through lack of awards, but not in promotions and pay increases. However, mostly through the relationship I had with IBM Senior Vice President Nick Donofrio, I learned the value of patents to the company and why they were essential. The same has been true here at Dell (approx. 6). So while the system exists, companies at least have to play the game.

When a widely popular, and broadly heard program such as This American Life gets involved, you know the end is coming. Grab your headphones and listen along to “When Patents Attack Part II”.

Large-scale Software Engineering at High Speed

I very much enjoyed presenting the Distinguished Lecture at this year’s Texas A&M Industrial Affiliates Program, and have uploaded my slides to slideshare.net. I had the opportunity to review and judge a number of the undergrad and doctoral poster sessions, and was impressed with both the breadth of the ideas being explored and the depth of the doctoral thesis topics. Some very imaginative projects. I liked a couple so much I’m going to make an effort to get their authors in as summer interns here at Dell.

Visiting universities, and especially computer science classes, is always fascinating; any trends that are going to happen are often really visible in this type of environment. What I noticed, and I shouldn’t have been surprised, was that a number of this year’s undergrad class had projects using Android phones and Bluetooth, combined with GPS. There were proximity projects, location-awareness projects, direction-finding projects and more. None really required a GSM contract.

What this indicates is that Android-based mobile phones are becoming generalized computing platforms, not just smart phones. Of course, if they are doing this at Texas A&M, similar projects will be running at other universities all over the world. The knowledge, education and development in this space will push the next generation of apps, often around the same platform. Back in 1998, I visited Warwick and a number of other universities, and that convinced me Linux was coming.
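Those proximity and location-awareness projects all rest on the same bit of arithmetic: the great-circle distance between two GPS fixes. A minimal Python sketch using the standard haversine formula (the campus coordinates below are illustrative, not taken from any of the student projects):

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in metres

def haversine_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance in metres between two (lat, lon) points in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def in_proximity(p1, p2, radius_m=100.0) -> bool:
    """The core test behind a proximity app: are two fixes within radius_m?"""
    return haversine_m(*p1, *p2) <= radius_m

# Two illustrative points a few hundred metres apart on the Texas A&M campus.
kyle_field = (30.6100, -96.3400)
msc = (30.6125, -96.3415)
print(in_proximity(kyle_field, msc, radius_m=500))  # True
```

Everything else in such a project, pairing over Bluetooth, polling the GPS fix, is plumbing around this one check, which is why none of it needed a GSM contract.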

Today, we’re in an era where speed is of the essence.

  • It’s critical to stay ahead of the competition.
  • The customer expectation for the Internet is much higher.

Engineering is a discipline.

  • Foundation: I look for people who have a firm foundation in engineering and treat it like a discipline
  • Definition: Look up engineering on Wikipedia and the first descriptive word behind it is discipline (followed by art and profession)
  • Software: Software is NOT [treated like] an engineering discipline today. [It’s all about invention]
  • Discipline: The key to success: We have to get better as a profession at treating software engineering like a discipline.

Culture is critical.

  • Garage band?: The days of four or five people starting out of their garage and working that way are less common now [in the enterprise app space, but increasingly IS the norm in the personal space].
  • Big=global. Most big projects are globally distributed and developed.
  • Global differences. The attitude and approach of teams in the USA, India, and China will be vastly different.
  • Play to cultural strengths. Adapt to cultural strengths — understand, and use them to your benefit.

Process matters.

  • The “how”: It’s not just what you’re doing, but how you do it.
  • No surprises: Good ≠ good; Bad ≠ bad
  • Communicate: Overcommunicate if needed, but make sure people understand and are aligned.
  • Incremental works: Let people see checkpoints where they can gauge progress and give individual groups/teams a chance to report out. You don’t need everyone to review.

Architecture must support the engineering.

  • No roadblock: Architecture can’t get in the way of engineering/development.
  • Good architecture: Allows people to work effectively in a globally distributed environment.
  • Vertical no more: Silos were the old way; technology grew up vertically.
  • Alignment: The way people think about constructing systems needs to match the engineering.
  • x86 and Cloud: Both allow for globally distributed environments (open source is another example)

Customer First: Dell’s Software Approach

  • Starting anew: Starting and building from scratch
  • Integration: It’s all about bringing elements together
  • Customer choice: We’re taking a different approach, delivering customer choice through open, horizontal integration (customers can choose hardware – storage, networking – and hypervisor)

Thanks to Michael Conway in the Dell Product Group for helping me crystallize my thoughts into a concise structure for the slides. Also to Dr. Valerie Taylor, Department Head and holder of the Royce E. Wisenbaker Professorship in Engineering at Texas A&M, for the invitation and for hosting my visit.

As always, if you have any comments or feedback, please feel free to post here, or via email.

Community involvement or free labor?

I’ve been following Andrew McAfee’s blog for a couple of weeks now, as a result of someone tweeting a link to one of his blog entries. In his latest blog post, “Three Mantras“, McAfee discusses something many in the tech industry will recognize: self-support systems. McAfee nicely summarizes the business opportunity in building online communities as support subsystems.

I posted some of my thoughts on the topic in the comments, namely the question of recognition and reward, not for participating, but for those who stay on and continue to participate. Initial participation is often self-rewarding; we go looking for help, experience or education in order to achieve some work-related task. Need some help using a particular programming language or API? As Apple could have said, “there’s a community forum for that!”

What makes McAfee’s blog interesting is his recognition of this phenomenon, and his translation of it into business terms and impact. For Dell, the guys that are part of the TechCenter have been doing a great job recently of creating knowledge and sharing it. They’ve recently run a number of demo and tech sessions on some of our key management technologies. You can find the Dell TechCenter here. It provides links into a wiki, discussion forums (just like the ones discussed by McAfee; the TechCenter forum currently has some 34,000 topics) and the increasingly popular TechTuesday chats.

As I said in a comment on McAfee’s blog, this isn’t a new phenomenon; as long ago as the late 1970s I was first introduced to VMSHARE, a user-run online bulletin board/time-sharing system, aka forum, to support and help users of IBM’s VM/370 operating system. While today there are many, many more forums, technologies and places to go for help, you can gain as much value from them today as I did then because, and that’s especially true for the Dell TechCenter, the people who participate are knowledgeable, dedicated and passionate about what they do; otherwise they wouldn’t do it.

Profiles in, err, courage

Back in March I caught an early Saturday morning bus to downtown Austin to attend Bar Camp IV. Suffice to say it’s mostly not a bar, and doesn’t involve camping (anymore).

I attended a few interesting sessions: I learned a few things about Windows 7 and mobile development, and went to a session on airships and blimps that I assumed was some kind of coded language for a session on clouds. But it wasn’t; it WAS about airships and blimps, and more.

Big-up to @whurley, @sarad and @linearb for organising, and to the various sponsors, who provided not only free attendance but also free lunch and libations.

I was on my way out when I bumped into Texas Social Media Awards finalist, local tech analyst and sometime contact Michael Cote from Redmonk. We passed the time of day, and he asked me if I wanted to be interviewed for a podcast. Why not?

I learned a ton about Cote from the interview, mostly that he doesn’t forget anything. We’ve met probably 5-6 times in the past, and he seemed to pull one question from each discussion. I mostly laughed the whole way through; I thought it was going to be a tech discussion, and while we did touch on a few topics, it was just a fun way to spend 10 minutes. You can hear the podcast and read the liner notes here on Redmonk Radio Episode 55. And no, I have no idea why the series was called “profiles in courage”, why I was selected, or why I giggled all the way through. It’s been a while since I did my press training, but I don’t remember them telling us about giggling as a technique!

Is SOA dead?

There has been a lot of fuss since the start of the new year around the theme “SOA is dead”. Much of this has been attributed to Anne Thomas Manes’ blog entry on the Burton Group’s blog, here.

InfoWorld’s Paul Krill jumped on the bandwagon with an SOA obituary, quoting Anne’s work and saying “SOA is dead but services will live on”. A quick-fire response came on a number of fronts, like this one from Duane Nickull at Adobe, and then this from James Governor at Redmonk, where he charismatically claims “everything is dead”.

First up, I’d heard this many times before in my career, and James touches on a few of the key examples, since we were there together. Or rather, I took advantage of his newness and thirst for knowledge as a junior reporter to explain to him how mainframes worked, and what the software could be made to do. I knew from ten years before I met James that evangelists, and those with an agenda, would often claim something was “dead”. It came from the early 1980s mainframe “wars”; yes, before there was a PC, we were having our own internal battles: this was dead, that was dead, etc.

What I learned from that experience is that technical people form crowds. Just like at the public hangings of the Middle Ages, they are all too quick to stand around and shout “hang him”. These days it’s a bit more complex: first off there’s Slashdot, then we have the modern equivalent of Speakers’ Corner, aka blogs, where those who shout loudest and most frequently often get heard the most. However, what most people want is not a one-sided rant, but to understand the issues. Claiming anything is dead often gives the claimer the right not to understand the thing that is supposedly “dead”, but to just give reasons why that must be so and move on to giving advice on what you should do instead. It was a similar debate last year that motivated me to document my “evangelism” years on the about page of my blog.

The first time I heard “SOA is dead” wasn’t Anne’s blog. It wasn’t even, as John Willis (aka botchagalupe on Twitter) claims in his Cloud Drop #38, him and Michael Cote of Redmonk last year. No sir, it was back in June 2007, when theregister.co.uk reprinted a piece by Clive Longbottom, Head of Research at Quocirca, under the headline SOA – Dead or Alive?

Clive got closest to the real reasons why SOA came about, in my opinion, and thus why SOA will prevail despite rumours of its demise. It is not just about services, from my perspective; it is about truly transactional services, which are often part of a workflow process.

Not that I’m about to claim that IBM invented SOA, or that my role in either the IBM SWG SOA initiative or the IBM STG services initiative was anything other than that of a team player rather than a lead. However, I did spend much of 2003/4 working across both divisions, trying to explain the differences and similarities between the two, and why one needed the other, or at least a relationship with it. And then IBM marketed the heck out of SOA.

One of the things we wanted to do was to unite the different server types around a common messaging and event architecture. There was almost no requirement for this to be synchronous, and a lot of reasons for it to be services based. Many of us had just come from the evolution of object technology inside IBM, and from working on making Java efficient within our servers. Thus, a services-based approach seemed for many reasons the best one.

However, when you looked at the types of messages and events that would be sent between systems, many of them could be crucial to the effective and efficient running of the infrastructure; they had, in effect, transactional characteristics. That is, a given event-A could initiate actions a, then b, then c and finally d. While action-d could be started before action-c, it couldn’t be started until action-b was completed, and that in turn was dependent on action-a. Importantly, none of these actions should be performed more than once for each instance of an event.

Think of the failure of a database or transactional server: create a new virtual server, boot the OS, start the application/database server, roll back incomplete transactions, take over the network, and so on. Or similar.
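The ordering and exactly-once constraints described above can be sketched in a few lines of Python. This is purely illustrative; the action names and the `handle_event` helper are my own shorthand, not anything from the IBM work:

```python
# Illustrative sketch of the event/action dependencies described above:
# an event triggers actions a..d; d may run before c, but both depend on
# b completing, and b depends on a. Each action runs at most once per event.
from graphlib import TopologicalSorter

# Map each action to the set of actions it depends on (hypothetical names).
DEPENDENCIES = {
    "a": set(),     # e.g. create a new virtual server
    "b": {"a"},     # e.g. boot the OS
    "c": {"b"},     # e.g. start the database server, roll back transactions
    "d": {"b"},     # e.g. take over the network; free to run before c
}

def handle_event(event_id, completed=None):
    """Run the actions for one event in dependency order, at most once each."""
    completed = set() if completed is None else completed
    performed = []
    for action in TopologicalSorter(DEPENDENCIES).static_order():
        if (event_id, action) in completed:
            continue  # exactly-once: this action already ran for this event
        completed.add((event_id, action))
        performed.append(action)
    return performed

print(handle_event("event-1"))  # a first, then b, then c and d in either order
```

The same shape applies to the failover example: substitute real recovery steps for the placeholder actions, and persist the `completed` set so a restarted coordinator doesn’t repeat work it has already done.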

Around the same time inside IBM, Beth Hutchison and others at IBM Hursley, along with smart people like Steve Graham, now at EMC, and Mandy Chessell, also of IBM Hursley, were trying to solve similar transactional-type problems over HTTP and using web services.

While the server group folks headed down the path of Grid, Grid Services and ultimately the Web Services Resource Framework, inside IBM we came to the same conclusion: incompatible messages, incompatible systems, different architectures, legacy systems and so on all need to interoperate, and for that you need a framework and a set of guidelines. Build this out from the infrastructure layer to the application level; add in customer applications to that framework; then scale it in any meaningful way, one that needs more than a few programmers working concurrently on the same code or on the same set of services, and what you needed was a services oriented architecture.

Now, I completely get the REST style of implementation and programming. There is no doubt that it could take over the world; from the perspective of those frantically building web mashups and cloud designs, it already has. But in none of the “SOA is dead” articles has anyone effectively discussed synchronous transactions; in fact, apart from Clive Longbottom’s piece, no real discussion was given to workflow, let alone the atomic transaction.

I’m not in denial here about what Amazon and Google are doing. Sure, both do transactions, and both were built from the ground up around a services-based architecture. Many of those who argue that “SOA is dead” are often those who want to move on to the emperor’s new clothes. However, as fast as applications are being moved to the cloud, many businesses are nowhere in sight of moving to or exploiting it. To help them get there, they’ll need to know how to do it, and for that they’ll need a roadmap, a framework and a set of guidelines, including how their legacy applications and systems get there. For that, they’ll likely need more than a strategy; they’ll need a services “oriented” architecture.

So, I guess we’ve arrived at the end, the same conclusion that many others have come to. But for me it is always about context.

I have to run now, literally. My weekly long run is Sunday afternoon and my running buddy @mstoonces will show up any minute. Also, given I’m starting my new job, I’m not sure how much time I’ll have to respond to your comments, but I welcome the discussion!

Clouds and the governor

I’ve been meaning to respond to Monkchips’ speculation over IBM and Amazon from last year, and his follow-up on why Amazon doesn’t need IBM. James and I met up briefly before Christmas, the day I resigned from IBM UK, but we ran out of time to discuss it. I wrote and posted a draft and never got around to finishing it; I was missing context. Then yesterday James published a blog entry entitled “15 Ways to Tell Its Not Cloud Computing”.

The straw that broke the camel’s back came today, on chinposin Friday, when James was clearly hustling for a bite as he tweeted “amazed i didn’t get more play for cloud computing blog”.

Well, here you go James. Your simple list of 15 reasons why it is not a cloud is entertaining, but it’s not analysis; it’s cheerleading.

I’m not going to trawl through the list and dissect it one by one; I’ll just go with the first entry and then move on to the bigger issue. James says “If you peel back the label and its says “Grid” or “OGSA” underneath… its not a cloud.” Why is that, James? How do you advocate organizations build clouds?
Continue reading ‘Clouds and the governor’

IBM’s new Enterprise Data Center vision

IBM announced today our new Enterprise Data Center vision. There are lots of links from the new ibm.com/datacenter web page, which split out into their various constituencies: Virtualization, Energy Efficiency, Security, Business resiliency and IT service delivery.

To net it out from my perspective, though: there is a lot of good technology behind this, and an interesting direction, summarized nicely starting on page 10 of the POV paper linked from the new data center page, or here.

What it lays out are the three main stages of adoption for the new data center: simplified, shared and dynamic. The Clabby Analytics paper, also linked from the new data center page or here, puts the three stages in a more consumable, practical tabular format.

They are really not new; many of our customers will have discussed them with us many times before. In fact, it’s no coincidence that the new Enterprise Data Center vision was launched the same day as the new IBM z10 mainframe. We started discussing and talking about these when I worked for Enterprise Systems in 1999, and we formally laid the groundwork in the on demand strategy in 2003. In fact, I see the Clabby paper has used the on demand operating environment block architecture to illustrate the service patterns. Who’d have guessed?

Simplify: reduce costs for infrastructure, operations and management

Share: for rapid deployment of infrastructure, at any scale

Dynamic: respond to new business requests across the company and beyond

However, the new Enterprise Data Center isn’t based on a mainframe, Z10 or otherwise. It’s about a style of computing, how to build, migrate and exploit a modern data center. Power Systems has some unique functions in both the Share and Dynamic stages, like partition mobility, with lots more to come.

For some further insight into the new data center vision, take a look at the presentation linked off my On a Clear day post from December.


About & Contact

I'm Mark Cathcart, formerly a Senior Distinguished Engineer in Dell's Software Group; before that, Director of Systems Engineering in the Enterprise Solutions Group at Dell. Prior to that, I was an IBM Distinguished Engineer and member of the IBM Academy of Technology. I am a Fellow of the British Computer Society (bcs.org). I'm an information technology optimist.


I was a member of the Linux Foundation Core Infrastructure Initiative Steering committee. Read more about it here.
