Archive for the 'software' Category

Mysterious Disappearing MAC Address

One of my systems applied a Windows 10 update on Friday. It runs attached to my TV, and so while not headless (i.e. the TV is its only monitor) it often runs for days without the UI visible. So there it was: has anyone ever clicked “Let’s Go”?

The system wasn’t connected to the Internet? Puzzling, since it has a 1Gb wired connection into a switch that goes straight to the 1Gb fibre optic cable modem, and everything else was working.

Choose Adapter settings > Disable > Enable > Wait > Identifying Network... > No Network Connection.

Next up was a CMD prompt and IPCONFIG /ALL

Strangely, it reported the IPv4 address as 169.x.x.x – no DNS etc. Then I spotted it. Physical Address: 00-00-00-00-00-00
Huh?
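For anyone who prefers PowerShell to a CMD prompt, the same check is a one-liner (a quick sketch; the cmdlet is standard on Windows 10, and adapter names vary by system):

    # List adapters with the MAC address Windows believes they have
    Get-NetAdapter | Format-Table Name, InterfaceDescription, MacAddress, Status

A zeroed-out MacAddress column here matches what IPCONFIG /ALL was showing.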

I tried all the usual things:

Disable Adapter > Delete Driver > Shutdown/Reboot

and variations of that. Then I went ahead and started searching the web, which was about as helpful as it always is. The only thing I learned was that I was far from alone, especially among Realtek PCIe GBE Family Controller users. I downloaded Realtek's device diagnostics, everything ran clean, and right there in the diagnostics window was the supposedly zeroed-out MAC/Physical address.

VPN Software?

I checked with the support team at NordVPN, which runs on that system; they assured me they do NOT change the MAC address or use any form of MAC spoofing in their software.

No Connectivity

The reason for the Internet connectivity issue is that the cable modem I use will not hand out DNS data or assign an IP address to any device that is not on the list of devices I maintain.

Among the various reports of issues relating to this, I found this one. So there is every possibility that it was a #Windows #WIN10 update that screwed up the MAC address, which is stored in the registry; who knows? Also, every one of the posts I found recommended an app to set and store a new MAC address. I’m not a big fan of either using REGEDIT or of downloading and installing random apps to update the registry.

Setting a MAC Address in Windows 10

Turns out you don’t need either. If you go into the properties for the adapter and scroll through them, you’ll come to “Network Address”.

In the Value field should be the same MAC address that is on the label that came with the PC. It should also match the MAC address you can find in the BIOS, if you want to go rooting around in there. If you have the MAC address, for example from the label on the PC hardware case, you can simply enter it back into the Value field and select OK. Just make sure you get the correct MAC address, don’t duplicate one already on your network, and don’t use the MAC address from your Wi-Fi adapter for your Ethernet adapter.

A picture I found online. Don’t do this, especially with a label that includes your Dell service tag… trust me on that.
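If you’d rather do the same thing from PowerShell than click through the adapter properties dialog, Set-NetAdapter exposes the same Network Address setting (a sketch; the adapter name and the MAC shown are illustrative, substitute your own):

    # Write the original MAC back into the adapter's Network Address property
    Set-NetAdapter -Name "Ethernet" -MacAddress "00-10-18-57-1B-0D"
    # Then bounce the adapter so it picks the new address up
    Restart-NetAdapter -Name "Ethernet"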

You can also look up vendor MAC address prefixes, here, and make a new one. Again, the MAC address does need to be unique. In my case I had the original, but while working through this issue I decided to use a MAC address starting with FCCF62, which is from a block assigned to “IBM Corp”, since I don’t have any IBM devices on my network and am unlikely to ever have any.
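Another option, rather than borrowing a real vendor’s block as I did, is a locally administered address, which the standard sets aside for exactly this kind of manual assignment. A quick PowerShell sketch to generate one (the bit-twiddling sets the locally-administered bit and clears the multicast bit in the first octet):

    # Generate a random locally administered, unicast MAC address
    $octets = @((((Get-Random -Maximum 256) -bor 0x02) -band 0xFE))
    $octets += 1..5 | ForEach-Object { Get-Random -Maximum 256 }
    ($octets | ForEach-Object { '{0:X2}' -f $_ }) -join '-'

You still need to check it doesn’t collide with anything already on your network.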

My system has been fine since fixing this. The change survived a couple of reboots, and I re-installed NordVPN and it’s also working fine.

Why Post?

First, obviously, was to document what I’d done; second was to share what had happened and how I resolved it; third was in the hope that someone would post a logical discussion of how this happened, and also how I could have resolved it more simply and quickly.

I remain amazed that the Realtek diagnostics (a) loaded the MAC address from the Windows registry, and (b) didn’t at least recognize that the MAC address wasn’t from a block they own.

#HEARTBLEED was 5 years ago.

I was reading through my old handwritten tech notebooks this morning, searching for some details on a Windows problem I know I’ve had before. I noticed an entry for March 28th, 2014 on the latest bug tracker list from Red Hat. One of the items on the list from the week before was the #Heartbleed bug in OpenSSL.

Image from synopsys.com

In less than a couple of weeks, Jim Zemlin from the Linux Foundation contacted John Hull in the open source team at Dell, who passed the call to me. I was happy to tell Jim we’d sign up; I got verbal approval for the spending commitment and the job was done.

The Core Infrastructure Initiative (CII) was announced on April 24th, 2014. One of the first priorities was how to build a more solid base for funding and enabling open source developers. With remarkable speed, the first projects to receive funding were announced on April 26th, 2014.

Five years later I’m delighted to see Dell are still members, along with the major tech vendors, especially and unsurprisingly, Google. Google employees have made substantial commitments both to CII and to open source projects in general. I remember with great appreciation many of the contributions made by the then steering committee members, especially, but not limited to, Ben Laurie and Bruce Schneier.

This blog post on synopsys.com, from May 2, 2017, has a summary entitled Heartbleed: OpenSSL vulnerability lives on.

My blog entries on Heartbleed and CII are here, here, and here.

There is still much to be concerned about. There are still many unpatched Apache HTTPD servers accessible on the Internet, especially versions 2.2.22 and 2.2.15.
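A crude way to see what a given server admits to running is to look at its Server response header (a sketch; many sites suppress or fake the banner, so the absence of a version string isn’t evidence of patching):

    # Show the advertised server software and version
    curl -sI https://example.com/ | grep -i '^server:'

The example.com hostname is a placeholder, obviously.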

Remember, just because you don’t see software, it doesn’t mean it isn’t there.

Serverless computing

I’ve been watching and reading about developments around serverless computing. I’ve never used it myself, so I have only a limited understanding. However, given my extensive knowledge of servers, firmware, OS, middleware and business applications, I’ve had a bunch of questions.


Many of my questions are echoed in this excellent write-up by Jeremy Daly on the recent Serverless NYC event.

For traditional enterprise type customers, it’s well worth reviewing the notes on the issues highlighted by Jason Katzer, Director of Software Engineering at Capital One. While some attendees talk about “upwards of a BILLION transactions per month” using serverless, and that’s impressive, it’s still short of many enterprise requirements: it translates to roughly 33 million transactions per day, or an average of under 400 transactions per second.

Katzer notes that there are always bottlenecks and often services that don’t scale the same way that your serverless apps do. Worth a read, thanks for posting Jeremy.

Open Source redux

While I don’t update here much anymore, that’s mostly because I’ve not been active in the general technology scene for the last 2.5 years, following my departure from Dell and the resultant non-compete. I’m taking a few easy steps back now; I’ve reactivated my British Computer Society (BCS) Fellow membership and am hoping to participate in their Open Source Specialist Group meeting and AGM on October 25th.

MS-DOS Open Source

Interestingly, Microsoft have announced they are re-open sourcing the code for the MS-DOS 1.25 and 2.0 releases. Although never available outside of Microsoft or IBM in its entirety, there were certainly sections of the code floating around in the mid-1980’s. I was given the code for some drivers in 1984 by an IBM Systems Engineer, which I proceeded to hack and use as a starter for the 3270 driver I used for file transfer.

I’ve got a copy of the code released by Microsoft, and over the next 6 months I am going to set about compiling it and getting it to work on a PC, as a way to re-introduce myself to working in PC assembler and the current state of compilers.
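For anyone who wants to play along, the sort of trivial warm-up I’ll be starting with looks like this: a minimal .COM program, written here in modern NASM syntax rather than the MASM of the era, so treat it purely as a sketch (it runs happily under DOSBox):

    ; hello.asm - build with: nasm -f bin hello.asm -o hello.com
            org 100h            ; .COM programs load at offset 100h
            mov dx, msg         ; DS:DX -> '$'-terminated string
            mov ah, 09h         ; DOS function 09h: print string
            int 21h
            mov ax, 4C00h       ; DOS function 4Ch: exit with return code 0
            int 21h
    msg     db 'Hello, DOS!', 0Dh, 0Ah, '$'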

The Zowe Open Source Project

This was announced today at SHARE St Louis. A great new effort and opportunity to integrate open source technologies and applications into the IBM z/OS operating system. Zowe, as the article says, is

a framework of software services that offers industry standard REST APIs, API catalog, extensible command line interface and web-based UI framework

They’ve also put together the zowe.org community for architects, developers and designers to share best practices. It’s not clear what the legal relationship is between the Open Mainframe Project and Zowe, but Zowe is listed as a project, so that’s great news in terms of strategy and direction. As of writing, the Open Mainframe Project Zowe web page has the best detail on the project.
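To give a flavor of the command line side, here’s the sort of thing the Zowe CLI enables from a workstation against z/OS data sets (a sketch based on the zos-files command group; it assumes you’ve already configured a connection profile, and the data set name is just illustrative):

    # List the members of a partitioned data set from your workstation
    zowe zos-files list all-members "SYS1.PARMLIB"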

Zowe appears to be a collaboration between IBM and a number of companies, including Rocket Software. Rocket has a broad portfolio of software and systems that integrate with IBM systems; they also have my friend, former colleague and sparring partner at IBM, Jim Porell on staff.

Digital Copiers, Faxes and MFPs and their hard drives

I’m a subscriber to long-time UK tech journalist and blogger Charles Arthur / @charlesarthur and his Overspill blog, where he curates links etc. Recently, he linked to an old report, from 2010, but it’s always worth reminding people of the dangers of photocopiers, fax machines and multi-function printers, especially older ones.

Copiers that are lightly used often have a lifecycle of 10-15 years. If you buy rather than lease, it’s quite possible you still have one that doesn’t include encryption of the internal hard drive. Even with an encrypted drive, there is still potential to hack the device software and retrieve the key, although that's pretty difficult.

The surprising thing is that many modern multi-function printers (MFPs) also have local storage. While in modern models it is not an actual hard drive, it is likely to be some form of onboard flash memory, à la cell phone memory, either part of the system board or via an embedded SD card. It’s worth remembering that these machines are fax, copier, printer, and scanner all in one machine.

The US Federal Trade Commission has a web page that covers all the basics, in plain language.

Whatever the device, it is still incumbent on the owner to ensure it is wiped before returning it, selling it, or scrapping it. PASS IT ON!
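For a device where you can physically remove the drive and attach it to a Linux box, even a single-pass overwrite beats handing it back untouched (a sketch; the device name is illustrative, double-check it before running, and note that flash media really wants the vendor's secure-erase function instead):

    # WARNING: irreversibly destroys all data on the target device
    sudo shred -v -n 1 /dev/sdX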

For those interested in how you can get data from a copier/MFP type device, Marshall University Forensic Science team has a paper, here.

Remembering the dawn of the open source movement

and this isn’t it.


Me re-booting an IBM System 360/40 in 1975

When I first started in IT in 1974, or data processing as it was called back then, open source was the only thing. People were already depending on it, and defending their right to access source code.

I’m delighted with the number and breadth of formal organizations that have grown up around “open source”. They are a great thing. Strength comes in numbers, as does recognition and bargaining power. Congratulations to the Open Source Initiative on everything they’ve achieved in their 20 years.

I understand the difference between closed source, (restrictive) licensed source code, free source, open source etc. The point here isn’t to argue one over the other, but to merely illustrate the lineage that has led to where we are today.

Perhaps one of the more significant steps in the modern open source movement was the creation in 2000 of the Open Source Development Labs (OSDL), which in 2007 merged with the Free Standards Group (FSG) to become the Linux Foundation. But of course source code didn’t start there.

Some people feel that the source code fissure was opened when Linus Torvalds released his Linux operating system in 1991 as open source; while Linus and many others think the work by Richard Stallman on the GNU toolset and GNU License, started in 1983, was the first step. Stallman’s determined advocacy for source code rights and source access certainly was a big contributor to where open source is today.

But it started way before Stallman. Open source can not only trace its roots to two of the industry’s behemoths, IBM and AT&T, but the original advocacy came from them too. Back in the early 1960’s, open source was the only thing. There wasn’t a software industry per se until the US Government invoked its antitrust law against IBM and AT&T, eventually forcing them, among other things, to unbundle their software and make it separately available, along with many other related conditions.

’69 is the beginning, not the end

The U.S. vs. I.B.M. antitrust case started in 1969, with the trial commencing in 1975(2). The case was specifically about IBM blocking competitive hardware makers from getting access, and customers from being able to run competitive systems, primarily S/360 architecture, using IBM software.

In the years leading up to 1969, customers had become increasingly frustrated with, and angry at, IBM’s policy of tying its software to its hardware. Since all the software at that time was available as source code, what that really meant was that a business HAD to have one IBM computer to get the source code; it could then purchase an IBM plug-compatible manufacturer’s (PCM) computer(1) and compile the source code with the manufacturer’s assembler and tools, then run the binaries on the PCM systems.

IBM made this increasingly harder as the PCM systems became more competitive. Often large, previously IBM-only users, who would have 2, 4, sometimes even 6 IBM S/360 systems costing tens of millions of dollars, would buy a single PCM computer. The IBM on-site systems engineers (SE) could see the struggles of the customer, and along with the customers themselves, started to push back against the policy. The SE job was made harder the more their hands were tied, and the more restrictions were put on the source code.

To SHARE or not to?

For the customers in the US, one of their major user groups, SHARE, had vast experience in source code distribution; its user-created content and tools tapes were legend. What most never knew is that back in 1959, with General Motors, SHARE had its own IBM mainframe (709) operating system, the SHARE Operating System (SOS).

At that time there were formal support offerings of on-site SEs who would work on problems and defects in SOS. But by 1962, IBM had introduced its own 7090 operating system, which was incompatible with SOS, and at that time IBM also withdrew support by its SEs and Program Support Representatives (PSRs) for work on SOS.

1965 is, to the best of my knowledge, when the open source code movement, as we know it today, started

Stallman’s experience with a printer driver mirrors exactly what had happened some 20 years before: the removal of source code, and the inability to build working modifications to support a business initiative, using hardware and software ostensibly already owned by the customer.

IBM made it increasingly harder to get the source code, until the antitrust case. By that time, many of IBM’s customers had created, and depended on, small and large modifications to IBM source code.

Antitrust outcomes

Computerworld - IBM OCO

By the mid-70’s, as one of the results of years of litigation and consent decrees in the United States, IBM had been required to unbundle its software and make it available separately. Initially it was chargeable to customers who wanted to run it on PCM, non-IBM systems, but over time, as new releases and new function appeared, even customers with IBM systems saw a charge appear, especially as Field Developed Programs moved to full Program Products and so on. In a bid to stop competing products and user group offerings being developed from its products, IBM increasingly supplied its products object-code-only (OCO). This became a formal policy in 1983.

I’ve kept the press cutting from ComputerWorld (March 1985) shown above since my days at Chemical Bank in New York. It pretty much sums up what was going on at the time: OCO, and users and user groups fighting back against IBM.

What this also did is give life to the formal software market. Companies were now used to paying for their software, and we’ve never looked back. In the time since those days, software with source code available has continued to flourish. With each new twist and evolution of technology, open source thrives and finds its own place, sometimes in a dominant position, sometimes subservient, in the background.

The times in the late 1950’s and 60’s were the dawn of open source. If users, programmers, researchers and scientists had not fought for their rights then, it is hard to know where the software industry would be now.

Footnotes

(1) The PCM industry had itself come about as a result of a 1956 antitrust case and the consent decree that followed.

(2) The 1969 antitrust case was eventually abandoned in 1982.

IoT App hell of the future

The day after some models of the Google Home Mini speaker were revealed to be recording voices 24/7 due to a defect, Danny Palmer published a thoughtful piece on ZDNet about the toxic legacy of IoT devices.

Danny is spot-on about the social and technological impact of connected devices past their support date. I’ve complained in the past about constantly updating apps, which add function that slows the original device, or remove function and change, often destroy, the original value proposition. But it’s perhaps when the devices stop getting updates that we have the most to fear.

I have a Netgear NAS that is out of support; in fact, since I have an identical NAS that wakes up Tuesdays at 2am and backs up the primary NAS, I have two of them. While they are out of support, Netgear has been good at fixing urgent vulnerabilities. Of course, since I can’t see the source, I don’t know what vulnerabilities they have not fixed.

Kate and I went to see Blade Runner 2049 on the opening day at the local AMC cinema. It’s a bit of a thing of mine to sit through ALL, and I mean all, of the end credits. As we left the theater, there it was, right at the very bottom of the screen, unseen from the seats: the Windows XP Start button. I have no idea what projector they were using, but yes, many projectors did, and obviously still do, run Windows XP.

Do you own the device you just bought?


Joshua Fairfield, Professor of Law at Washington and Lee University, has a great blog post that echoes exactly the sentiments I heard Richard Stallman express about his original drive for open source, way back in the 1980’s.

Fairfield argues that we don’t own the devices we buy, we are merely buying a one-time license to the software within them. He makes a great case. It’s worth the read.

One key reason we don’t control our devices is that the companies that make them seem to think – and definitely act like – they still own them, even after we’ve bought them. A person may purchase a nice-looking box full of electronics that can function as a smartphone, the corporate argument goes, but they buy a license only to use the software inside. The companies say they still own the software, and because they own it, they can control it. It’s as if a car dealer sold a car, but claimed ownership of the motor.

My favorite counter-example of this is the Logitech Squeezebox network music player system I use. Originally created by Slim Devices as far back as 2000, their first music player launched in 2001. Slim Devices were acquired by Logitech in 2006, who then abandoned the product line in 2012.

I started using Logitech Squeezebox in 2008, first by buying a Squeezebox Boom, then a Radio, another Boom, and a Touch; I have subsequently bought a used Duet, and, for my main living room, the audiophile-quality Transporter.

While there are virtually no new client/players, there is a thriving client base built around Raspberry Pi hardware, with both client software builds and add-on audio hardware, as well as server builds for the Pi. I’ve hacked some temporary preferences into the code to solve minor problems, but by far the most impressive enhancements to the long-abandoned official server codebase are the extensions that keep up with changes in streaming services, like BBC iPlayer radio, Spotify, DSD play and streaming, and many more. For any normal, closed source platform, any one of these enhancements would likely have been impossible, and for many users would have made the hardware redundant.

The best place to start in the Squeezebox world is over on the forums, hosted, of course, at http://forums.slimdevices.com/
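If you want to try the server side of this today, the community keeps a Docker image going, which makes standing up a test server almost trivial (a sketch; I'm assuming the lmscommunity/logitechmediaserver image name as published on Docker Hub, and the music path is illustrative):

    # Run Logitech Media Server: 9000 is the web UI, 3483 is the player protocol
    docker run -d --name lms \
      -p 9000:9000 -p 3483:3483 -p 3483:3483/udp \
      -v /srv/music:/music:ro \
      lmscommunity/logitechmediaserver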

When my 1-month-old Ring (video) doorbell failed, it was all I could do to get Ring to respond. I spent nearly 4 hours on the phone with tech support. Not only did I have no control (the doorbell had stopped talking to their service), but they couldn’t really help. After the second session with support, I just said “look, I’m done, can you send a replacement?” The tech support agent agreed they would, but 10 days later I was still waiting for even a shipping notice, much less a replacement. While the doorbell worked as a doorbell, none of the services, motion detection or doorbell rings, were any good, as their services were unavailable to my doorbell.

You don’t have to give up control when you buy a new device. You do own the skeleton of the hardware, but you’ll have to make informed choices, and probably give up control, if you want to own the soul of the machine: its software.

Nobody wants to use…

Everyone wants to have everything. Bertil Muth has a great blog post on software invisibility and use, where he asserts “Nobody wants to use software”.

Bertil makes a good case for AI-driven software that senses or learns why it exists, and just does what it should. Of course, building such software is hard, very hard. It’s a good read though, with some thought-provoking points.

In the article, when discussing Amazon, he made a claim that was worth clarifying. It’s about the “infamous” 1-Click patent. My comment is here.

“Then they [Amazon] pioneered 1-Click payment”
Actually they didn’t; they popularized a prior method, which after re-examination by the patent office was restricted to use online, only in shopping carts.

The idea of a single click payment or financial transaction had been implemented many times before, however, prior to 1982 software patents were extremely hard to get for individual functions of so-called unique concepts, and were reserved for much broader, unique “inventions”.

In 1984, I was one of many working on Chemical Bank’s Pronto home banking system. For transfers between accounts within the bank, we implemented a 1-click transfer in the UI for the IBM PCjr version of Pronto.

As far as I’m aware, nothing from Pronto was patented, due to the high cost at the time. It wasn’t until the late 1980’s that software patents started to be filed for individual methods; by the mid-90’s software patents had become commonplace, and their use, both defensive and offensive, sadly became commonplace too.

Overall though, it’s an excellent post which resonates with many of the themes of simplicity and usability I’ve argued here and elsewhere over the years.

Woe are apps

As a follow-on to my recent app post, a couple of interesting updates. First up, marketplace.org ran an interesting piece on apps on June 9th. Sabri Ben-Achour covered the Apple iTunes announcement by saying:

  • It’s hard for app developers to get noticed (that’s a “no shit Sherlock” moment)
  • It’s hard to make money (that’s NSS #2)
  • There are 1.6 million apps on the Apple store, and the search function isn’t that great
  • There have been 75 billion app downloads, but the average user downloads zero apps per month.

Apple’s answer? Paid promotion within the iTunes store. Of course, if apps didn’t exist and companies and developers were using the power of mobile through the web, CSS etc., their sites would be found in the context of content and SEO. They could focus their efforts in a single way to promote their content and the web UI to access it.

Also new, to me: I went to use Skype to contact one of my kids in Europe the other day and was surprised, and more than a little disappointed, to find the Skype app was no longer working and no longer available. It’s not clear if this was a business decision or a technology one. The app was the only one I ever used on the Samsung SmartTV that used the camera. Yeah, I know, I should have taped over the camera.

That’s the problem with apps: you wait for ages for a platform that makes sense, and then two or more come along at the same time. You'd better hope you pick the right one. There are some 137 pages on a single thread on the Skype Community forums debating whether Skype or Samsung was the wrong platform.


Mainframe Assembler Language 2.0

Those that still follow my blog from my days working in the IBM mainframe arena might be interested in the following.

One of the stalwarts of software at IBM, and self-described grand poobah of High Level Assembler, John R. Ehrman has a 1300-page 2.0 version of his book “Assembler Language Programming for IBM System z™ Servers” and it’s available in PDF form here. There is a wealth of other assembler resources that John has contributed here on ibm.com.
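For anyone who hasn’t touched mainframe assembler in a while, the flavor hasn’t changed much; here’s a minimal HLASM-style program of the kind such a book opens with (a sketch using classic OS linkage conventions, not an excerpt from Ehrman’s text):

    HELLO    CSECT
             STM   14,12,12(13)       save caller's registers
             BALR  12,0               establish R12 as the base register
             USING *,12
             WTO   'HELLO FROM HLASM' write-to-operator: message to console
             LM    14,12,12(13)       restore caller's registers
             SR    15,15              zero the return code in R15
             BR    14                 return to caller
             END   HELLO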

Touch screen and the desktop

I just posted a response over on a CNET discussion topic. As is often the case, rather than write, review, edit and post, I banged away a response and submitted; as always I made a few typos, so here is a corrected version.

I’ve just retired from a senior engineering position at Dell, specializing in software and firmware, but I also participated in a number of usability studies for hardware/software combinations. I was the originator of the NFC-enabled server systems management concept. I’d offer a few thoughts to confirm what some others have said, but also a slightly different perspective.

1. Yes, reaching across a keyboard to a monitor mounted at the back of a desk is ergonomically unpleasant.

2. Touch is an interesting technology, but for fixed monitors, TVs etc. it is less than optimal. There are numerous efforts underway to come up with a more responsive, natural way to control a UI. Think Xbox or Nintendo, or the Samsung SmartTV gestures, or voice à la Amazon Echo.

3. That said, I for one would never go back to a non-touch laptop screen. I can lift my arm from the keyboard and prod the “submit post” button below much quicker than I can use the touchpad, or grab an external mouse and click.

4. If you want a touch screen desktop, I’d highly recommend getting an all-in-one with a touch screen and mounting it into a desk. I had one of the Dell XPS 27’s and an IKEA draftsman's desk. We cut a hole 99% the size of the screen; mounted the screen into the hole; and secured it with picture wire in a # format across the back. I gave up using a physical keyboard and mouse, bought a Targus stylus and went 100% touch. The advantage of the IKEA desk is that you can easily angle the surface to one that suits you. Also, it came with a metal lip which stopped things sliding off the edge, and a built-in glass area, which was great for to-do lists, notes etc.

One final note on touch screen PCs. As with Windows 10, when switching over to a touch screen you have to try to stop doing things the way you did them with a mouse and keyboard. The Adobe PDF app for Windows 10 is much easier to use by touch than the Adobe desktop app. Using a drawing program for line art, block diagrams etc., either with your finger or with a stylus, is a huge leap forward from messing about with Word and PowerPoint. In the case of slides and PowerPoint, it released me from decades of serial text mode slides.

So rather than ask why there are so few touch screens for desktop computers, ask: what are the top 5 applications I use, and how could a touch screen make them better, easier, or me more productive? If it’s email, calendar and web browsing, it probably won’t. Although even in those cases, zoom in and zoom out is an improvement.

Linux Foundation Core Infrastructure Initiative

In a discussion recently I was asked about the Linux Foundation Core Infrastructure Initiative and whether it was still active.

Indeed it is; they’ve made some great progress on funding and supporting open source projects, and there are some interesting developments coming before the end of the year. CII has funded a number of projects through their grants process; you can read more about some of the projects, and help with prioritization.

It’s not in the nature of the CII to broadcast its work; the best measure of success is no vulnerabilities in the projects they are supporting. Projects funded, following on from the initial OpenSSL grant, include:

  • Network Time Protocol (NTP)
  • GnuPG
  • OpenSSH
  • Debian Reproducible Builds
  • The Fuzzing Project
  • False-Positive-Free Testing with Frama-C

Details of the grants etc. are here. Also, I’ve finally added my profile to the CII web site, as seen here.

CII Profile

Case story for Dell Software and Hardware

I’ve not posted much of late as I’m working on a lot of back office and process stuff, but I'm still working in Dell Software Group. I recently attended the annual Dell Patent Award dinner, where I was able to catch up with Michael Dell and my boss, John Swainson, as well as a few other executives and many of the great innovators and inventors.

My former boss and Dell Vice President, Gerry Hackett, made an interesting point in her remarks prior to doing the roll call for her team at the dinner. She said, to the effect, that Dell was going to be the only integrated solution provider. I was surprised, but thinking it through, she was right.

When I saw this customer story about San Bernardino County School district, I thought it was worth linking here.


About & Contact

I'm Mark Cathcart, formerly a Senior Distinguished Engineer in Dell's Software Group; before that, Director of Systems Engineering in the Enterprise Solutions Group at Dell. Prior to that, I was an IBM Distinguished Engineer and member of the IBM Academy of Technology. I am a Fellow of the British Computer Society (bcs.org). I'm an information technology optimist.


I was a member of the Linux Foundation Core Infrastructure Initiative Steering committee. Read more about it here.
