Posts Tagged 'software'

Remembering the dawn of the open source movement

and this isn’t it.


Me re-booting an IBM System 360/40 in 1975

When I first started in IT in 1974, or data processing as it was called back then, open source was the only thing. People were already depending on it, and defending their right to access source code.

I’m delighted with the number and breadth of formal organizations that have grown up around “open source”. They are a great thing. Strength comes in numbers, as does recognition and bargaining power. Congratulations to the Open Source Initiative and everything they’ve achieved in their 20 years.

I understand the difference between closed source, (restrictive) licensed source code, free source, open source, etc. The point here isn’t to argue one over the other, but merely to illustrate the lineage that has led to where we are today.

Perhaps one of the more significant steps in the modern open source movement was the creation in 2000 of the Open Source Development Labs (OSDL), which in 2007 merged with the Free Standards Group (FSG) to become the Linux Foundation. But of course source code didn’t start there.

Some people feel that the source code fissure was opened when Linus Torvalds released his Linux operating system in 1991 as open source, while Linus and many others think the work by Richard Stallman on the GNU Toolset and GNU License, started in 1983, was the first step. Stallman’s determined advocacy for source code rights and source access certainly was a big contributor to where open source is today.

But it started way before Stallman. Open source can not only trace its roots to two of the industry’s behemoths, IBM and AT&T, but the original advocacy came from them too. Back in the early 1960s, open source was the only thing. There wasn’t a software industry per se until the US Government invoked its antitrust laws against IBM and AT&T, eventually forcing them, among other things, to unbundle their software and make it separately available.

’69 is the beginning, not the end

The U.S. vs. I.B.M. antitrust case started in 1969, with the trial commencing in 1975(1). The case was specifically about IBM blocking competitive hardware makers from getting access, and preventing customers from running competitive systems, primarily of S/360 architecture, using IBM software.

In the years leading up to 1969, customers had become increasingly frustrated and angry at IBM’s policy of tying its software to its hardware. Since all the software at that time was available as source code, what that really meant was that a business HAD to have at least one IBM computer to get the source code; it could then purchase a plug-compatible manufacturer’s (PCM) computer(2), compile the source code with the manufacturer’s assembler and tools, and run the binaries on the PCM systems.

IBM made this increasingly harder as the PCM systems became more competitive. Often, large, previously IBM-only customers who had two, four, sometimes even six IBM S/360 systems, costing tens of millions of dollars, would buy a single PCM computer. The IBM on-site systems engineers (SEs) could see the struggles of the customer and, along with the customers themselves, started to push back against the policy. The SEs’ job was made harder the more their hands were tied, and the more restrictions were put on the source code.

To SHARE or not to?

For the customers in the US, one of their major user groups, SHARE, had vast experience in source code distribution; its user-created content and tools tapes were legend. What most never knew is that back in 1959, together with General Motors, SHARE had its own IBM mainframe (709) operating system, the SHARE Operating System (SOS).

At that time there were formal support offerings: on-site SEs would work on problems and defects in SOS. But by 1962, IBM had introduced its own 7090 operating system, which was incompatible with SOS, and at the same time IBM withdrew support by its SEs and Program Support Representatives (PSRs) for work on SOS.

1965 is, to the best of my knowledge, when the open source code movement, as we know it today, started

To my knowledge, that’s where the open source code movement, as we know it today, started. Stallman’s experience with a printer driver mirrors exactly what had happened some 20 years before: the removal of source code, and the inability to build working modifications to support a business initiative, using hardware and software ostensibly already owned by the customer.

IBM made it increasingly harder to get the source code, until the antitrust case. By that time, many of IBM’s customers had created, and depended on, small and large modifications to IBM source code.

Antitrust outcomes

Computerworld – IBM OCO

By the mid-70s, as one of the results of years of litigation and consent decrees in the United States, IBM had been required to unbundle its software and make it available separately. Initially it was chargeable to customers who wanted to run it on PCM, non-IBM systems, but over time, as new releases and new function appeared, even customers with IBM systems saw a charge appear, especially as Field Developed Programs moved to full Program Products and so on. In a bid to stop competing products and user group offerings being developed from their products, IBM products were increasingly supplied object-code-only (OCO). This became a formal policy in 1983.

I’ve kept the press cutting from Computerworld (March 1985) shown above since my days at Chemical Bank in New York. It pretty much sums up what was going on at the time: OCO, and users and user groups fighting back against IBM.

What this also did was give life to the formal software market. Companies were now used to paying for their software, and we’ve never looked back. In the time since those days, software with source code available has continued to flourish. With each new twist and evolution of technology, open source thrives and finds its own place, sometimes in a dominant position, sometimes subservient, in the background.

The times in the late 1950s and ’60s were the dawn of open source. If users, programmers, researchers and scientists had not fought for their rights then, it is hard to know where the software industry would be now.

Footnotes

(1) The 1969 antitrust case was eventually abandoned in 1982.

(2) The PCM industry had itself come about as a result of a 1956 antitrust case and the consent decree that followed.

Nobody wants to use…

Everyone wants to have everything. Bertil Muth has a great blog post on software invisibility and use, where he asserts “Nobody wants to use software”.

Bertil makes a good case for AI-driven software that senses or learns why it exists, and just does what it should. Of course building such software is hard, very hard. It’s a good read, though, with some thought-provoking points.

In the article, when discussing Amazon, he made a claim that is worth clarifying. It’s about the “infamous” 1-click patent. My comment is here.

“Then they [Amazon] pioneered 1-Click payment”
Actually they didn’t; they popularized a prior method, which after re-examination by the patent office was restricted to online use, and only in shopping carts.

The idea of a single-click payment or financial transaction had been implemented many times before. However, prior to 1982, software patents were extremely hard to get for individual functions of so-called unique concepts, and were reserved for much broader, unique “inventions”.

In 1984, I was one of many working on Chemical Bank’s Pronto home banking system. For transfers between accounts within the bank, we implemented a 1-click transfer in the UI for the PC Junior version of Pronto.
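To make the idea concrete: the essence of any 1-click transaction is that everything except the final action is registered ahead of time, so a single action can complete it. Here is a minimal, hypothetical C sketch of that pattern; the names and structure are my own illustration and bear no relation to the actual Pronto code.

```c
#include <stdio.h>

/* Details registered once, ahead of time, so a single action can
   complete the transaction later. */
struct transfer_preset {
    const char *from_account;
    const char *to_account;
    long        amount_cents;
};

/* The "one click": no forms to fill in, just execute the stored preset.
   A real system would authenticate, validate and post to a ledger. */
static int one_click_transfer(const struct transfer_preset *p) {
    printf("Transfer %ld.%02ld from %s to %s\n",
           p->amount_cents / 100, p->amount_cents % 100,
           p->from_account, p->to_account);
    return 0;  /* success */
}

int main(void) {
    struct transfer_preset savings_to_checking = {
        "SAV-001", "CHK-001", 5000  /* $50.00 */
    };
    return one_click_transfer(&savings_to_checking);
}
```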

As far as I’m aware, nothing from Pronto was patented, due to the high cost at the time. It wasn’t until the late 1980s that software patents started to be filed for individual methods; by the mid-90s software patents had become commonplace, and their use, both defensive and offensive, sadly became commonplace too.

Overall though, it’s an excellent post which resonates with many of the themes of simplicity and usability I’ve argued here and elsewhere over the years.

The app hell of the future

Just over 5 years ago, in April 2011, I wrote this post after having a fairly interesting exchange with my then boss, Michael Dell, and George Conoly, co-founder and CEO of Forrester Research. I’m guessing that, in the long term, the disagreement and semi-public dissension shut some doors in front of me.

Fast forward 5 years, and we are getting the equivalent of a do-over as the Internet of Things and “bots” become the next big thing. This arrived in my email the other day:

This year, MobileBeat is diving deep into the new paradigm that’s rocking the mobile world. It’s the big shift away from our love affair with apps to AI, messaging, and bots – and is poised to transform the mobile ecosystem.

Yes, it’s the emperor’s new clothes of software all over again. Marketing-led software always does this: it over-imagines what’s possible, underestimates the issues with building it, and then the fail-fast product methodology kicks in. So bots will be the next bloatware, becoming a security attack front. Too much code, force-fit into micro-controllers. An ecosystem driven solely by the need to make money. Instead of tiny pieces of firmware that have a single job, wax-on, wax-off, they will become dumping grounds for lots of short-term fixes that never go away.

Meanwhile, the app hell of today continues. My phone apps update all the time, mostly with no noticeable new function; I’m required to register with loads of different “app stores”, each one a walled garden with few published rules, no oversight, and little transparency. The only real source of trusted apps is GitHub and the like, where you can at least scan the source code.

When these apps update, it doesn’t always go well. See this picture of my Garmin Fenix 3, a classic walled garden: my phone starts to update at 8:10 a.m., and when it’s done, my watch says it’s now 7:11 a.m.

Over on my Samsung Smart TV, I switch it from monitor to Smart TV mode and get this… it never ends. Nothing resolves it except disconnecting the power supply. It recovered OK, but this is hardly a good user experience.

Yeah, I have a lot of smart home stuff, but little or none of it is immune to the app upgrade death spiral; each app upgrade takes the device nearer to obsolescence because there isn’t enough memory or storage, or the processor isn’t fast enough, to include the bloated functions marketing thinks it needs.

If the IoT and message bots are really the future, then software engineers need to stand up and be counted. Design small, tight, reentrant code. Document the interfaces, publish the source, and instead of continuously being pushed to deliver more and more function, push back: software has got to become engineering and not a form of storytelling.
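As a rough illustration of what “small, tight, reentrant” means in practice, here is a minimal, hypothetical C sketch (not from any real firmware) contrasting a routine that hides state in a static buffer with a reentrant version that touches only what the caller passes in.

```c
#include <stddef.h>
#include <stdio.h>

/* Non-reentrant: hidden static state means interleaved or concurrent
   callers trample each other's results. */
static char shared_buf[32];
const char *format_temp_bad(int tenths_celsius) {
    snprintf(shared_buf, sizeof shared_buf, "%d.%d C",
             tenths_celsius / 10, tenths_celsius % 10);
    return shared_buf;
}

/* Reentrant: no globals, no static data; the caller owns the buffer,
   so the routine is safe to call from interrupts or multiple tasks. */
int format_temp(int tenths_celsius, char *out, size_t out_len) {
    return snprintf(out, out_len, "%d.%d C",
                    tenths_celsius / 10, tenths_celsius % 10);
}

int main(void) {
    char buf[32];
    format_temp(236, buf, sizeof buf);   /* "23.6 C" */
    puts(buf);
    return 0;
}
```

The reentrant version is exactly the kind of single-purpose, do-one-job routine that doesn’t need an endless stream of updates.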


Mainframe Assembler Language 2.0

Those that still follow my blog from my days working in the IBM mainframe arena might be interested in the following.

One of the stalwarts of software at IBM, and self-described grand poobah of High Level Assembler, John R. Ehrman, has a 1,300-page 2.0 version of his book “Assembler Language Programming for IBM System z™ Servers”, and it’s available in PDF form here. There is a wealth of other assembler resources that John has contributed here on ibm.com.

The Open Mainframe Project

It would be remiss of me not to mention another new Linux Foundation project, the Open Mainframe Project. I’m actually pretty interested, from a purely personal perspective, to see what this project does and where they plan to take Linux on the mainframe.

I’m glad to see that both Linux on the mainframe and its ecosystem are still thriving. I was heavily involved with it back in the late ’90s, and wrote essentially the only public strategy in the original and republished IBM Redbook “Linux for S/390”; the first four chapters were mine.

I can recall with great fondness discussing with the then head of IBM Systems Group, and future IBM CEO, Sam Palmisano, and many others, the real reason Linux would be key to future success: its freedom. With India and China coming on stream as technology powerhouses, with millions of future programmers, it was clear that they would learn on Linux.

Even though Windows was still the most pervasive operating system in 1998-2000, it was clear to anyone who understood technical people that Linux would influence not just code, but threading, languages, library structures, call interfaces and more at the system level, for no other reason than that people can study the source, learn from it, adapt it and so on. That was a train IBM couldn’t stop; we needed to be on board before it left without us. There is a good NY Times article from the period here.

Good luck to the Open Mainframe project.

Dell Software Official Site – Simplify IT Management

We’ve released our latest web presence for the Dell Software group. It’s got direct download links, try and/or buy options, easy-to-find information and more.

Dell Software Official Site – Simplify IT Management | Mitigate Risk | Accelerate Results.

Software at Dell

Come work with us

We’re looking for a Senior Software Development Performance Engineer with a proven track record around performance engineering to join our Dell Software Common Engineering (CE) performance team. This team, as part of the Office of the CTO, helps product teams across the Software Group achieve their performance and scalability objectives through direct involvement and consulting engagements.

We are working on some key forward-looking technologies essential to the future of Dell Software Group, and we need someone specifically to help with performance design, as well as with recommendations to the design and implementation teams.

Here is a direct link for the job application with a referral from me. As an exec, I don’t qualify for the referral bonus, so if you know another Dell employee who isn’t an exec, feel free to reach out to them!


About & Contact

I'm Mark Cathcart, formerly a Senior Distinguished Engineer in Dell's Software Group; before that, Director of Systems Engineering in the Enterprise Solutions Group at Dell. Prior to that, I was an IBM Distinguished Engineer and member of the IBM Academy of Technology. I am a Fellow of the British Computer Society (bcs.org). I'm an information technology optimist.


I was a member of the Linux Foundation Core Infrastructure Initiative Steering committee. Read more about it here.
