Let’s talk about the Metaverse

January 1st, 2023
Filed under: General | Huibert @ 5:19 pm

A couple of weeks ago I was invited by Meta to discuss IBM’s point of view about the Metaverse. While I don’t believe the technology is fully ready for prime time, the opportunities are intriguing and certainly worth exploring. From a back-end developer perspective, AR/VR are simply new channels that companies can enable to keep in touch with their customers wherever they are. While I do not believe that customers will spend many hours continuously wearing AR/VR headsets (at least the ones currently available), there are certainly tasks that will be best executed in a VR or mixed reality environment. That is where IBM will provide the technologies and tools to enable companies to build realistic worlds that will appeal to their customers, using all kinds of technologies such as scalable microservices, avatars, AI and new generations of chatbots. You can listen to the complete interview (in Spanish) in Episode 10 of the “Hablemos del Metaverso” podcast.


Hola Mundo Podcast

March 24th, 2022
Filed under: General | Huibert @ 1:08 pm

Speaking of public speaking, I have recently been invited by DaCodes, a Mexican company specializing in software development, to participate in their podcast to discuss my career. You can find the interview (in Spanish) on most podcast platforms. Here is the link to Spotify if you want to listen to the episode:


Working with universities

March 24th, 2022
Filed under: General | Huibert @ 12:54 pm

I haven’t written much in this blog over the last few years, but I have been very active working with schools and universities as part of my current role at IBM. This is something that I enjoy quite a bit, and it has allowed me to meet a lot of wonderful people devoted to education, a subject that has always fascinated me, ever since the mid-80s when I wrote Teacher’s Wizard, an early courseware app for the Apple II.

Universities are eager to get help from leading companies like IBM to talk about industry trends and hot new technologies. Over the last two years those topics have included digital transformation, cloud computing, quantum computing, AI, Kubernetes/OpenShift, etc. I have been fortunate enough to be invited by many leading Mexican universities to discuss these and other topics with their students and faculty members.

Unfortunately, due to the pandemic, most of the presentations I have given over the last two years have been remote. On the bright side, some of those can now be found on YouTube (all in Spanish):


Old Apple II software preserved

January 8th, 2019
Filed under: General | Huibert @ 1:32 pm

Apple Rainbow Logo

A couple of months ago I returned to my parents’ home in Madrid, Spain and was able to find some of the early code I wrote for the Apple II. I wasn’t very confident that my 5 1/4” floppy disks could have survived 30 years in storage, but I decided to send them to my friend Antoine Vignau (from Brutal Deluxe Software) anyway. Some of the disks were damaged, but I had enough backup copies that eventually both Teacher’s Wizard and G.A.P.E. could be fully recovered. I was quite amazed to be able to load these products I created over 30 years ago on my Mac and run them in an emulator.

G.A.P.E. (Global Applesoft Program Editor) was an Applesoft editor much like Call A.P.P.L.E.’s G.P.L.E., based on an editor I used on a PR1ME minicomputer when living in Geneva in the early 80’s. I submitted this program in 1985 to a contest organized by Philips in Europe called the Holland Prize (now the European Union Contest for Young Scientists). Although I didn’t win, it was a great experience and G.A.P.E. became my first “professional quality” software title.

Teacher’s Wizard was a tool for teachers that allowed them to easily create courseware. It was quite sophisticated for the time because it could be used with a mouse and incorporated many of the same concepts that would later be made popular by HyperCard. This program was originally developed for Edelvives, a Spanish book publisher that worked closely with many schools. I later sold the rights for the rest of the world to Britannica Software.

Both programs can now be freely downloaded.


Old IIGS software recovered

November 9th, 2016
Filed under: General | Huibert @ 1:55 pm

Open Apple

A couple of months ago, Antoine Vignau helped me recover the contents of my old HD 20SC hard drive. The disk was in very bad shape but he was still able to image it and most of the contents could be recovered. What really surprised me was that I was able to recover two programs I wrote in the late eighties.

Among the many interesting things that I found on that disk was an old NDA (New Desk Accessory) that I wrote back in 1988. If you have been using your IIGS with a French keyboard and hoped for better support for accented characters, AZERTY may help you.

The other product I was able to rescue was Jigsaw Deluxe, an improved version of my first Apple IIGS game, Jigsaw! This new version adds several new features that make it more fun.

Download and enjoy!


Open Apple Interview

May 9th, 2016
Filed under: Apple, Apple IIgs, General | Huibert @ 6:04 pm

Open Apple

A couple of weeks ago I was interviewed by Mike Maginnis and Quinn Dunki from the Open Apple podcast, a monthly show about the Apple II. I had a lot of fun sharing some of the stories behind the development of SoundSmith and some of my other Apple II titles. I realize that not many people are interested in vintage computing, but if you are my age and had the pleasure of experiencing the early days of personal computing, you may be interested in listening.

Here is the link to the episode.

You may also be interested in subscribing and listening to older episodes. I particularly enjoyed the ones with Bill Budge (of Raster Blaster fame) and Mike Westerfield (The Byte Works), but there are many others also worth listening to.


Java is almost 20 years old

July 29th, 2015
Filed under: Enterprise Architecture, General, IBM, Internet, Java | Huibert @ 8:08 pm


It is hard to believe, but Sun Microsystems released Java 1.0 almost 20 years ago, on January 23rd, 1996. I was an early adopter because I was intrigued by the write once, run anywhere promise. At the time, developing for the Mac did not look like a viable career option, and yet I did not like the idea of having to switch platforms. As a result, Java seemed a great option.

Java 1.0 couldn’t do much beyond producing animated content for the browser, but it was easy to learn, mostly because it had so few libraries available at launch. In fact, this was part of the excitement, as so many basic building blocks had to be created in order to allow other developers to build more powerful applications. As an example, during Java’s early days I wrote a GUI for applets (Swing didn’t exist and AWT sucked), inspired by the classic MacOS toolkit. I called it MAE for Java. It was a lot of fun.

Over the next few years, Java grew up. In 1997, version 1.1 added JDBC and the servlet standard, which paved the way for the application server era. I discovered WebLogic at the second ever JavaOne event in San Francisco and even though the product had limited capabilities at the time, it was clear to me that the concept had a lot of potential. Java was quickly becoming a solid platform and a serious contender in the Enterprise world. Over the following years, the J2EE spec (now simply known as JEE) continued to mature in order to address an increasingly large array of IT requirements (SOA, Web development, encryption, MQ integration, O/R mapping, etc.). For those of us who got on the bandwagon early, adopting these technologies was easy. We just had to learn a couple of new APIs each year, a pace which, with hindsight, now seems quite reasonable. Everything was great. So great in fact that I credit Java for developing a whole generation of IT Architects. I am of course talking about senior professionals who usually have somewhere between 15 and 20 years of experience, not the kids just hired out of school by consulting firms and labeled “Architects” to justify higher hourly rates.

So, how can someone become an architect today? It all starts by learning the right programming language. Some languages, like Visual Basic (let’s use a dead language as an example to avoid offending anyone), are great for quickly building specialized solutions, but won’t help you with your career. I for one have never met a CTO or CIO who got his/her job after a successful and gratifying career as a Visual Basic programmer. On the other hand, Java was designed from the beginning as a general-purpose language, meant to build any kind of application. Sun Microsystems, which was on a mission to conquer the world, wanted their language to be used for everything, from embedded systems to large distributed enterprise applications. To achieve that goal, they enlisted most of the IT industry leaders (IBM, SAP, Oracle, etc.) to help them provide Java developers with a large selection of rich, stable and supported APIs as well as solid developer tools like Eclipse. The results achieved by this broad industry alliance have simply been amazing. Twenty years later, no other computer language comes even close to the level of versatility Java offers today. Engineers who grew up with Java got progressively exposed to a large number of technologies, which allowed them in turn to grow their own careers and eventually become Architects or CTOs.

Despite Java’s undeniable success, something weird started to happen somewhere between the releases of J2EE 5 (2006) and JEE 6 (2009). People started to label Java as “heavy”, “complex” and “hard to learn”. Sure, the fact that Oracle bought Sun in 2010, adding fear and uncertainty to the future of the platform, did not help, but this trend started well before the acquisition. Learning Java was becoming increasingly hard for beginners. In my opinion, this doesn’t speak ill of Java, but it does raise some serious questions about how we should teach complex platforms to beginners, an issue we definitely haven’t solved yet. That said, perception is reality and interest in Java started to dwindle, despite the success of Android.

Over the last few years, countless new programming languages have appeared and are now fighting for our attention. Some are great, others not so much. The problem is that, in order to become viable alternatives to Java, especially in the enterprise, these languages will need to mature. Even if the language itself may not have to evolve significantly, the platform will need to grow. In order to solve complex problems, new APIs will have to be built, and over time these platforms will inevitably become as complex as Java is now.

I am not saying that we shouldn’t try to replace Java with a better alternative just because complexity will inevitably creep into any successful development platform; on the contrary. A better programming language with a strong API library could be a significant boon for developers. I absolutely believe that a better programming language can make us more productive. However, adopting a better language does not make complex projects significantly simpler. You may be able to use fewer lines of code to achieve your goal, avoid potential errors or even simplify the development of multithreaded code, but in the end, hard problems remain hard to solve and require experienced professionals who can design complex systems. A better language is great, but it is pretty much useless if it does not have the APIs enterprise developers require.

I would love to have a true alternative to Java. It would be great if I could use a modern non-proprietary language such as JavaScript, Go, Rust or Swift to write any kind of system. However, none of these great languages will become serious contenders in the enterprise unless we give them the opportunity to mature. This requires strong stewardship and industry support. That is why my money right now is on JavaScript as the most likely successor to Java, especially now that the technology is backed by industry heavyweights IBM and Microsoft. That said, JavaScript still has a long way to go before it can compete with Java, and that concerns me. The main problem we face right now is the limited attention span of the developer community. We tend to jump too quickly from one hot technology to the next. Hadoop has been red hot for the last few years, but now Apache Spark seems to have taken a lot of wind out of its sails. In the relational database space, MySQL seemed to be about to take on Oracle and DB2, but now interest in open-source RDBMS is waning in favor of multiple NoSQL databases (Cassandra, MongoDB, CouchDB, etc.). If the developer community does not stand strong behind JavaScript for several years, Java may not have a successor and young developers will not have a chance at building a successful career in enterprise IT.

For now, Java is still among the most popular languages out there, with a stable market share, but the situation is quickly changing, with long-time favorites such as C# and Objective-C quickly losing steam in favor of newcomers like JavaScript and Swift. Java is still very strong in the enterprise, but interest in the language is at an all-time low among those who want to learn programming. If we want to build a new generation of IT Architects and CTOs, we need a replacement for Java, and we can’t wait another 20 years.


The value of experience

August 25th, 2014
Filed under: Internet, Politics, Society | Huibert @ 9:13 am

In an interview with the Information Security Media Group publication, White House cybersecurity coordinator Michael Daniel admits to having no practical experience with the subject matter. Daniel claims that “being too down in the weeds at the technical level could actually be a little bit of a distraction” from his job of advising the president about ongoing and emergent information security issues.

The White House appointed Daniel to the position in May 2012; he had previously served as the intelligence branch chief in the White House Office of Management and Budget. He believes that his lack of practical experience in the field is offset by his master’s degrees in national resource planning and public policy. He also credits previous government experience for success in the position, augmented by his martial arts experience.

As the Electronista article states, Daniel isn’t responsible for the technical details of a fix or solution to a country-wide issue. Rather, his job is to assess the situation, report to the president, and bring other agencies into the fold and “on the same page” about an issue. Jim Lewis, a senior fellow at the think tank Center for Strategic and International Studies, claims that the lack of experience doesn’t hinder Daniel’s role in the position, arguing that “Computer scientists were in charge and they did a terrible job, being lost in the weeds and largely clueless about policy. You need someone with a strategic point of view and policy skill to make progress.”

Every time I read something like this, I get extremely upset. This theory mostly assumes that there are only two types of people in any organization: the leaders who can handle any kind of situation and the specialists whose only responsibility is to execute the master plan. This is great for many executives, because unless they are found to be personally responsible for a major screwup, it shields them from any accountability. If something goes wrong, it is never because the plan was flawed in the first place, it is due to poor execution, which can almost always be blamed on managers who are lower on the organizational chart. The problem, of course, is that this is simply not true. Most issues in a company, especially in high tech, can be directly traced to the lack of a clear vision that can be communicated to the employees for proper execution. Execs who do not understand their product or market in detail are unable to produce a winning growth strategy; it is that simple. In this context, former GE CEO Jack Welch is often mentioned as an example of a leader who didn’t need to be an expert in washing machines to turn around a very complex, diversified company. However, there are few Jack Welches in the world, and it is easy to find many examples of successful leaders who were experts in their markets, especially if we only consider fast growing markets, like cyber security. My personal opinion is that Jack Welch, who undeniably achieved great success at GE, is now used as an example by mediocre executives to try to justify why not knowing anything about their respective fields is not a problem, and this is simply wrong.

MBA programs from prestigious universities are in large part to blame for propagating the idea that lack of experience is not a problem. Business professors usually tell their students from the beginning that they are destined for greatness and that they will learn how to make decisions by studying the experience of great company leaders. However, unlike what many execs seem to believe, leadership is not about making decisions by choosing one of the options presented to you by your team. It is about setting a direction and executing on a plan that you have designed. That requires both experience and guts. I don’t know about Daniel’s guts, but he clearly lacks experience in cyber security, a skill that is extremely hard to acquire, and that will significantly hinder any attempts he makes to define a “strategic point of view”. Therefore, from my point of view, he is clearly a poor choice for the job. That doesn’t mean that his government experience is not important, it clearly is, but he should at the very least have recognized this shortcoming and explained how he planned to address it, instead of simply dismissing his critics. President Obama is accountable for having chosen Mr. Daniel for this position, but Mr. Daniel also shares part of the responsibility. Leaders, to be successful, need to have zero tolerance for mediocrity, and that includes their own. Those who accept a leadership position need to be convinced that they are a good fit for the job and that they will be able to deliver results. Integrity begins with an honest introspection exercise to find out if you are the right choice for the position.


Has Apple really changed?

June 7th, 2014
Filed under: Apple, General, iPad, iPhone, Mac OS X, Macintosh | Huibert @ 2:37 pm


This has been quite an exciting week. Apple has introduced over 4,000 new APIs for both OS X and iOS, and most developers have been raving about how the “new” Apple led by Tim Cook has changed. They claim that the company now listens more to its customers, and they point to the fact that iOS will support custom keyboards, allow the use of the fingerprint reader and offer inter-application communication mechanisms as proof that things have changed.

Frankly, I am not convinced that much has fundamentally changed. When Apple launched their Rip, Mix and Burn campaign in 2001, they were clearly listening to what customers wanted at the time. In order to do it properly, they had to plan for their vision, which included buying the application (SoundJam MP) that would eventually become iTunes and adding the capability to easily burn CDs. That took some time, but they eventually released the product they wanted.

What happened this week was similar. Apple may have wanted to offer the possibility to install alternative keyboards for a while, but it took some time to deliver the capability in a secure form. What is so dangerous about alternative keyboards? Well, imagine that the keyboard logs all your keystrokes and sends them to some server in Ukraine. All your passwords and credit card numbers would be gone. So, what is needed to make sure that supporting alternate keyboards is safe? Well, one way to do that is to avoid using the keyboard to enter credentials or credit card numbers in the first place. That is something that Apple has solved by releasing iCloud Keychain in iOS 7 and by opening up the use of the fingerprint reader in iOS 8. The other thing to do is to forbid internet access for the keyboard app if the user chooses to do so. That was also announced by Apple as part of iOS 8. It is likely that by the time Apple releases iOS 8, all new iOS devices will include a fingerprint reader and, as a result, should be well protected against malicious keyboard apps. As you can see, opening iOS to support alternative keyboards is not something totally new that came out of nowhere; it is the result of careful planning and making sure that everything is in place before launching a new feature.
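To make the keyboard argument a bit more concrete, here is a minimal, hypothetical sketch of what an iOS 8-style custom keyboard extension looks like. The class and button names are mine, purely for illustration; the point is that typed text only flows through the text document proxy, and that the extension has no network access unless its Info.plist requests open access and the user explicitly enables “Allow Full Access”.

```swift
import UIKit

// A minimal custom keyboard extension sketch (names are illustrative).
// Unless the extension's Info.plist sets RequestsOpenAccess and the user
// grants "Allow Full Access", the keyboard runs sandboxed without network
// access, so it cannot send keystrokes anywhere.
class SketchKeyboardViewController: UIInputViewController {

    override func viewDidLoad() {
        super.viewDidLoad()

        // A single letter key, just to show how typed text reaches the host app.
        let keyA = UIButton(type: .system)
        keyA.setTitle("A", for: .normal)
        keyA.frame = CGRect(x: 20, y: 20, width: 44, height: 44)
        keyA.addTarget(self, action: #selector(didTapKey(_:)), for: .touchUpInside)
        view.addSubview(keyA)

        // The required "next keyboard" (globe) button.
        let nextKey = UIButton(type: .system)
        nextKey.setTitle("Next", for: .normal)
        nextKey.frame = CGRect(x: 80, y: 20, width: 60, height: 44)
        nextKey.addTarget(self, action: #selector(didTapNextKeyboard), for: .touchUpInside)
        view.addSubview(nextKey)
    }

    @objc private func didTapKey(_ sender: UIButton) {
        // All insertion goes through the text document proxy; the keyboard
        // never gets broader access to the host app's content.
        textDocumentProxy.insertText(sender.title(for: .normal) ?? "")
    }

    @objc private func didTapNextKeyboard() {
        advanceToNextInputMode()
    }
}
```

In other words, it is the sandbox, not the keyboard author’s goodwill, that keeps keystrokes from leaving the device unless the user opts in.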

Swift, the new programming language launched by Apple at WWDC, is another interesting example. This new language has been in the works for about four years now. It is a modern language with a lot of cool new features, but I would hardly call it revolutionary. What is interesting about Swift is that, as far as I know, it is the first language designed from the ground up to make the use of an existing library much easier. Normally, a language is designed to solve a particular problem that other existing languages cannot handle well (multi-tasking, security, etc.). However, Swift seems to be designed solely for the purpose of giving the Cocoa framework a new lease on life. By bridging its native types to Cocoa objects (Swift strings, for example, bridge to NSString) and hiding the complexity of handling structs, Swift makes it much easier to write code for Apple platforms without impacting the huge investment made by Apple and NeXT in Cocoa over the last 25 years (NeXTSTEP was launched in 1989). This makes a lot of sense, because it preserves Apple’s biggest asset while giving us developers what we want. Swift is therefore, in that sense, evolutionary and not revolutionary. It is the result of a plan launched years ago with the adoption of the LLVM compiler and the launch of Objective-C 2.0, and if Apple is really planning on eventually moving their Macs away from Intel, applications written in Swift will make the transition transparent for application developers.
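As a rough illustration of that bridging (a sketch, not Apple’s documentation), this is the kind of code the design enables: a native Swift String can be handed to long-standing NSString APIs and back. Note that current Swift requires the explicit `as NSString` cast; in the 2014 betas discussed here the bridging was implicit.

```swift
import Foundation

// Swift's native String bridges to Foundation's NSString, so decades-old
// Cocoa APIs keep working without wrappers or conversions.
let path: String = "/tmp/report.pdf"

// Calling classic NSString methods on a Swift string:
let fileName = (path as NSString).lastPathComponent   // "report.pdf"
let ext = (path as NSString).pathExtension            // "pdf"

// And the reverse: an NSString coming back from a Cocoa API can be used
// wherever a Swift String is expected.
let cocoaString: NSString = "Hola, mundo"
let swiftString: String = cocoaString as String

print(fileName, ext, swiftString)
```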

WWDC 2014 was a great event because it saw the fruition of many initiatives started by Apple years ago, not because Tim Cook just started to listen to customers and developers, but because Apple seems to be accelerating the delivery of features that result from a carefully crafted plan. The success of Apple depends on maintaining a clear long- and medium-term plan to deliver their vision, as they have done so far, and not on delivering a long list of short-sighted features.


WWDC 2014

May 31st, 2014
Filed under: Apple, General, iPad, iPhone, iPod, Mac OS X, Macintosh | Huibert @ 3:47 pm


Once again, I won’t be able to attend WWDC. I am very excited though that on Monday we will be able to see what Apple has in store for us for the next few years, because I believe that this event will be more about announcing the foundation of things to come than actual products we will be able to buy in June. 

From a developer perspective, Xcode 6, iOS 8 and OS X 10.10 should include enough new functionality to keep us busy for the next few months. Support for larger iPhones will probably translate into a lot of work to prepare old apps for the official launch of the new devices. Something similar is to be expected for Mac developers, who will have to deal with a flatter overall design, including updated controls. I certainly hope that the changes are more than skin deep, because while appearance is important and a uniform look and feel across Apple devices can make the user’s life much easier, when I use my Mac, it is all about what I can do with it; great looks come second.

What I would like to see announced at WWDC are improvements around iCloud, namely lower pricing and APIs for Windows, Linux and Android. Writing a cross-platform app that syncs data among devices is not very difficult; there are many scalable document-based data stores that can handle this task (Cloudant comes to mind). The problem is persuading customers to pay for the service. Apple, on the other hand, can do that much more effectively because they already have a large customer base that uses the free service or pays for iCloud once a year and gets a lot of value by using the service with not one but multiple apps. The value proposition is much better. Sure, there are competing services, like Dropbox, but I like the Apple option better because I can safely assume that all Apple customers have an account.
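For what it’s worth, here is a rough sketch of the kind of document sync alluded to above: pushing a JSON document to a CouchDB-compatible store over its standard HTTP API (Cloudant speaks that API). The database name, document type and base URL are placeholders, and real code would also track revisions and handle conflicts.

```swift
import Foundation

// A hypothetical document we want to keep in sync across devices.
struct Note: Codable {
    let _id: String
    let text: String
    let updatedAt: Date
}

// Push one document to a CouchDB-style endpoint: PUT /{db}/{docid} with a JSON body.
func push(_ note: Note, to baseURL: URL, completion: @escaping (Error?) -> Void) {
    let url = baseURL.appendingPathComponent("notes").appendingPathComponent(note._id)
    var request = URLRequest(url: url)
    request.httpMethod = "PUT"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")

    let encoder = JSONEncoder()
    encoder.dateEncodingStrategy = .iso8601
    request.httpBody = try? encoder.encode(note)

    URLSession.shared.dataTask(with: request) { _, _, error in
        completion(error)  // real code would inspect the HTTP status and returned revision
    }.resume()
}
```

The hard part, as the paragraph says, is not the plumbing; it is getting customers to pay for the service and trust it with their data, which is exactly where Apple has the advantage.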

On the hardware front, I do not have many expectations. Apple has been unable to keep hardware leaks from happening in China in the past and right now we haven’t seen enough credible information to believe a product launch is imminent. If there are any announcements it will be like last year’s MacPro, a simple preview with a launch date, to generate pent-up demand.

I have no doubt that WWDC 2014 will all be about announcing the infrastructure for things to come, namely new services that will be available only to customers with modern hardware (fingerprint reader and the M7 processor as well as future devices) which will generate a need to upgrade old devices and leave the competition in the dust for a while. Apple has had several years to build the infrastructure and plan for this moment. On Monday we will finally understand what Apple has been working on. We may not understand the full reach of these announcements until Apple launches their new devices in the fall, but it will be an exciting event. I will be spending a lot of time on the treadmill next week, watching the WWDC session videos on my Apple TV.