I somewhat recently ran into this screenshot of a rant that Linus Torvalds went on about C++ and why he thinks C is the better programming language. I usually don’t care much for the drama because I think each programming language has its purpose and there will always be developers who love them and hate them. However, what I did find interesting about this particular rant is that it not only came from a very well-respected developer who has a deep understanding of low-level languages, but also that his arguments fly in the face of conventional, modern software architecture.
What I mean by that is that he argues against object-oriented programming, class inheritance and other core principles of most modern languages. I am, of course, aware that his rant is nearly twenty years old, but even then, those were considered important principles of software architecture.
AI-generated image of shelves full of paper, books and floppy disks
I recently ran into a post on another blog titled “No Data Lasts Forever.” As the name implies, it discusses how long data survives, not only in digital form but also in older forms such as stone, papyrus and paper.
I found the post interesting and worth writing about on my blog because it is a topic I have given a lot of thought to myself. One of my degrees is in history, and using, archiving and preserving old documents is a large part of a historian’s job, so I do have some experience in that area. On top of that, I produce a lot of written content across my various blogs, have composed and recorded several music albums, and have written a ton of code in my free time.
All of this has driven me to think about how long it will last. The content I produce is almost exclusively digital and, of course, no one really knows how long digital data can be preserved. Data corruption is always a looming specter, but the biggest threat is probably losing access to the software and technology needed to read that data.
Most programs use proprietary formats to save the data they produce. This means that once those programs are no longer supported or can no longer be run, the potential for data loss is massive.
So how can you ensure your digital data will last as long as possible? That is a difficult question to answer, as no answer comes with a guarantee. However, there are a few things you can do to keep it accessible for as long as possible.
Attempting to Preserve Digital Data
The first item to mention is to make your data as universally accessible as possible. What I mean by this is that a person should be able to open it on any platform using a wide range of programs. A great example of this is text files. It doesn’t matter whether the extension is .txt, .html, .md or something else; these files can be opened on any platform using a huge variety of programs.
Plain text is also so widespread that the likelihood of it not working in the future is minimal. There are other formats you could probably lump into that category as well, though their chances aren’t quite as high. Examples are Microsoft Word documents (.doc/.docx), PDFs (.pdf), various image formats (.jpeg/.jpg, .png, .gif), etc. These are ubiquitous in modern computing and can be opened by a plethora of different applications on a variety of platforms. The risk, though, is that there might be compatibility problems with, say, opening a .docx file in LibreOffice, which means the original might not be perfectly preserved. Even a new version of Microsoft Word itself might not be able to open an older file properly anymore.
It also goes without saying that you need to make sure you have backups: onsite as well as offsite. Those backups will help with preservation for anyone who has access to them, but if you are creating works that you want available to the public (such as this blog), then it makes sense to also have public backups so that the content will continue to be accessible in the future.
I have done exactly that with my blogs by creating a script that exports the content using the WordPress API, converts it to Markdown (plain text) and pushes it to a public repository on GitHub. There is one repository per blog and the backup script runs once every evening. You can read more about it in the post I wrote about it.
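To give a rough idea of how such a script can work, here is a minimal sketch in TypeScript. It is not my actual script; it assumes the standard WordPress REST API, Node 18 or newer for the built-in fetch, and the Turndown package for the HTML-to-Markdown conversion, and the blog URL and output folder are placeholders:

```typescript
// A minimal sketch, not my actual backup script: pull posts from the standard
// WordPress REST API and write them out as Markdown files. Assumes Node 18+
// (built-in fetch) and the "turndown" package; the blog URL and output folder
// are placeholders.
import { mkdir, writeFile } from "node:fs/promises";
import TurndownService from "turndown";

const API = "https://example.com/wp-json/wp/v2/posts"; // placeholder blog URL
const OUT_DIR = "./backup";                            // placeholder output folder

interface WpPost {
  slug: string;
  title: { rendered: string };
  content: { rendered: string };
}

async function exportPosts(): Promise<void> {
  const turndown = new TurndownService();
  await mkdir(OUT_DIR, { recursive: true });

  // The API is paginated; keep fetching pages until WordPress runs out of posts.
  for (let page = 1; ; page++) {
    const res = await fetch(`${API}?per_page=100&page=${page}`);
    if (!res.ok) break; // WordPress answers with an error once we pass the last page
    const posts = (await res.json()) as WpPost[];
    if (posts.length === 0) break;

    for (const post of posts) {
      const markdown = `# ${post.title.rendered}\n\n${turndown.turndown(post.content.rendered)}\n`;
      await writeFile(`${OUT_DIR}/${post.slug}.md`, markdown, "utf8");
    }
  }
  // Committing and pushing OUT_DIR to the GitHub repository is a separate, plain git step.
}

exportPosts().catch(console.error);
```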
The point I am trying to make with this, though, is that GitHub will most likely be around a lot longer than this blog, which is self-hosted on a rented virtual server. I also intend to push the content to other git hosting services such as GitLab, but haven’t gotten around to it yet.
Of course, I am under no illusion that GitHub, GitLab or any other company is going to be around forever, which leaves the question: What happens when they’ve all shut down? Well, the bad news is that if you want something to be public, you have to rely on someone to host it. That’s especially true in the long run. Even sites like the Internet Archive, whose sole purpose is to, well, archive the internet, probably won’t be around for very long in the grand scheme of things.
Other Options
So, if you can’t entirely rely on someone else to host your content forever, how do you preserve it for as long as possible? Well, for one, you’ll probably have to forego having it archived publicly. You can certainly make as many copies as possible on various types of media such as CDs or DVDs, but even those decay in a matter of decades, which really isn’t very long at all when you consider that we still have hieroglyphics carved into stone thousands of years ago.
I’m afraid the only answer I can think of to that question is to simply print it out. Paper doesn’t last forever either, but it is at least a known quantity and, if taken care of properly, can last for centuries. We still have plenty of paper documents that date back to the Dark Ages and beyond.
Is it realistic to print everything out? Probably not. I’m certainly not going to bother doing that with my blogs and even if I did, it would only be for the posts I consider to be the best. I do, however, keep a daily journal which I always have a hard copy of. Some years I’ve handwritten it in a notebook; in others, I’ve typed it and printed it out at the end of the year. The journal isn’t intended just for me, though, but also for my son, his children, their children and so on. That’s why having a hard copy of it is critical for me.
Conclusion
Digital data preservation is a very difficult subject and we have yet to see how long data can actually be preserved without corruption, media failures, software incompatibility and so on. We can already measure it to an extent in decades, but what about centuries or even millennia? We just can’t know that yet.
There is a lot of interest in the subject though as more and more data becomes exclusively digital. Government and corporate entities certainly have a huge interest in preserving and archiving their digital data for as long as possible which is why I think we will continue to see improvements in this area.
Well, at the beginning of February, Raymond published another post on the same Microsoft blog responding to comments on his original post that wondered whether the developers responsible for the Windows 95 setup had forgotten that MS-DOS could do basic GUI interfaces. The idea, of course, was that since MS-DOS could do basic GUI interfaces, the Windows 3.1 and Windows 95 parts of the setup were redundant.
Unsurprisingly, Raymond makes quite a few good points in his response:
One of the reactions to my discussion of why Windows 95 setup used three operating systems (and oh there were many) was my explanation that an MS-DOS based setup program would be text-mode. But c’mon, MS-DOS could do graphics! Are you just a bunch of morons?
Yes, MS-DOS could do graphics, in the sense that it didn’t actively prevent you from doing graphics. You were still responsible for everything yourself, though. There were no graphics primitives aside from a BIOS call to plot a single pixel. Everything else was on you, and you didn’t want to use the BIOS call to plot pixels anyway because it was slow. If you wanted any modicum of performance, you had to access the frame buffer directly.
[…]
But the setup program needs more than just pixels. It also wants dialog boxes, so you’ll have to write a window manager to sit on top of your graphics library so you can show dialog boxes with a standard GUI dialog interface, which includes keyboard support for tabbing between elements and assigning hotkeys to fields.
You’ll also have to add support for typing characters in non-alphabetic languages like Japanese. Fortunately, you have a team of folks expert in Japanese input sitting in the Tokyo office working on Windows input methods who can help you out, though the time zone difference between Tokyo and Redmond is going to slow you down.
[…]
Now take a step back and look at what you’re doing. You’re writing an operating system. (Or, if you are being uncharitable, you’re writing an MS-DOS shell.)
An operating system with exactly one application: Windows 95 Setup.
What if I told you that Microsoft already had an operating system that did all the things you are trying to do, and it’s fully debugged, with video drivers, a graphics library, a dialog manager, a scheduler, a protected mode manager, and input methods. And it has a fully staffed support team. And that operating system has withstood years of real-world usage? And Microsoft fully owns the rights to it, so you don’t have to worry about royalties or licensing fees? And it’s a well-known system that even has books written about how to program it, so it’ll be easier to hire new people to join your team, since you don’t have to spend a month teaching them how to code for your new custom Setup UI miniature operating system.
Of course, I’m not going to quote the entire thing here, but I definitely recommend reading it in full for its tongue-in-cheek humor. It makes a compelling argument for why the Windows 95 setup team did not create a GUI for setup using only MS-DOS.
I am generally a huge fan of Apple’s interfaces. Sometimes they are oversimplified in a way that can be frustrating, but as a rule, their great interface design makes their software pleasant to use in a way that other companies can’t hold a candle to.
That said, not everything they do is great and some things are downright frustrating. One of them is the placement of the “Stop” button on the full-screen alerts for the built-in timer and alarm functions. When either one goes off while your iPhone is locked, a full-screen message gives you the option to stop it. The two designs are identical except that the “Stop” button is in the opposite position, which leads to me frequently repeating the timer or snoozing the alarm (the functions offered by the other button) rather than turning it off.
You can see what I mean here:
iPhone Timer Stop Button
iPhone Alarm Stop Button
It’s a small thing, but it is still a glaring oversight in an otherwise very consistent UI. Of course, this is complaining at a very high level as an Apple user. If this were a Microsoft or open source project, you would just be happy that the screens look somewhat similar and work properly.
Do you have any pet peeves in any of the interface designs you use on a regular basis? If so, what are they?
A little while ago, I stumbled upon a collection of photos taken from Microsoft’s office in Albuquerque, New Mexico in 1979. At the time, Microsoft was just a small company and it certainly wasn’t clear that they would become the mega-corporation they are now. Even Bill Gates recently said, “I wouldn’t say that I felt comfortable that we were successful until about 1998 or so.”
The photos are fascinating to look at because they provide a glimpse into what it was like to work at a computer company during the infancy of the computing industry as we now know it. While most of the people have computers on their desks, there is certainly a lot more paper than you would see on most programmers’ desks today. Several of them were also photographed looking at binders and books rather than at their screens, which is not something you would be likely to see today.
Overall, it’s interesting to see how times have changed and how thoroughly computers have changed our working environments. At the same time though, it’s striking how little programmers themselves have changed. No suits and ties to be found despite that being the standard office attire at the time!
You can find the photos as well as information about them provided by the photographer at the Web Archives.
Have you ever wondered why the Windows 95 setup seems to consist of three different types of UI? It starts with a typical text-based, MS-DOS UI, then moves on to an interface with buttons and other elements resembling those found in Windows 3.1. Then, in the final stage, you see elements that look like they belong to Windows 95.
As it turns out, that is because setup actually runs on three different operating systems: MS-DOS, Windows 3.1 and Windows 95. This seems somewhat excessive, but Raymond Chen, a developer at Microsoft who has worked on Windows for over 30 years, wrote an explanation on one of Microsoft’s official blogs about the reasoning behind it:
Windows 95 setup could upgrade from three starting points: MS-DOS, Windows 3.1, or Windows 95. (Yes, you could upgrade Windows 95 to Windows 95. You might do this to repair a corrupted system while preserving data.)
One option is to write three versions of Windows 95 setup: One for setting up from MS-DOS, another for setting up from Windows 3.1, and a third for setting up from Windows 95.
This was not a pleasant option because you basically did the same work three times, but implemented separately, so you have to do three times the coding.
A better option is to just write one version of Windows 95 setup and use it for all three starting points. So now you get to choose the platform on which to base your code.
[…]
In the middle is the happy medium: You can have the MS-DOS setup program install a minimal version of Windows 3.1, just barely enough to support what the 16-bit GUI setup program needs.¹ This tiny version is small enough to be copied and installed off a small number of floppy disks. Once that’s done, boot into the tiny version of Windows 3.1 and run the 16-bit GUI setup program.
For anyone interested in retro operating systems or computing history, this is a short but very interesting article about the challenges they faced making Windows 95 installable on as many computers as possible back in the early-to-mid ’90s.
I recently came across an article on Ars Technica that talks about how Google appears to be losing the fight against SEO spam. This is certainly a phenomenon I have noticed not only with Google but also with Bing and DuckDuckGo, which is powered by Bing.
If you don’t know what “SEO spam” is, it’s essentially content whose only purpose is to rank high in search results so that people will click on the link and land on a specific website. While the page the user ends up on has to have at least some content relevant to their search query, it is most likely only a fairly useless lure to get them onto the website in the hope that they will click around.
It’s not just you—Google Search is getting worse. A new study from Leipzig University, Bauhaus-University Weimar, and the Center for Scalable Data Analytics and Artificial Intelligence looked at Google search quality for a year and found the company is losing the war against SEO (Search Engine Optimization) spam.
[…]
Overall, the study found that “the majority of high-ranking product reviews in the result pages of commercial search engines (SERPs) use affiliate marketing, and significant amounts are outright SEO product review spam.” Search engines occasionally update their ranking algorithms to try to combat spam, but the study found that “search engines seem to lose the cat-and-mouse game that is SEO spam” and that there are “strong correlations between search engine rankings and affiliate marketing, as well as a trend toward simplified, repetitive, and potentially AI-generated content.”
The practice frustrates me because you search for something and end up on the blog of some random company that sells shoes but writes articles about the latest .NET framework or something else entirely irrelevant to its actual business. It is simply a means of catching people who aren’t searching for shoes and getting them onto the website anyway.
Articles like that tend to be shallow, full of repetitious writing (which search engines seem to prefer for keyword relevance) and usually don’t contain the information I’m looking for. It’s basically a lose-lose for everyone since I then leave the website pronto.
Microsoft’s new version of Outlook for Windows has seen its fair share of criticism and, frankly, rightfully so. It is a noticeable downgrade in many ways from the classic native application that has been with us since Outlook’s first release in 1997. The UI is more modern and it contains a bunch of new features such as better integration with iCloud and Google’s services, but as a web app rather than a native application, it is significantly slower and some really basic features are lacking.
The specific omission I wanted to mention is, frankly, shocking in 2024: the ability to set a username for an IMAP account. This is a basic, essential feature that has been around for decades, and yet the new Outlook doesn’t have it, which just flabbergasts me. It may seem trivial, but it means I can’t use it for my email because my web host requires a username that isn’t the email address; Outlook simply assumes your email address is your username. Mail clients should never assume anything, because everyone’s setup and requirements are different.
Outlook for Windows doesn’t allow you to set a username for IMAP accounts
Instead, I’ve adopted Thunderbird as my email client of choice for Windows. It is a native application and, more critically, includes the extremely basic feature of letting you set a username for an IMAP account. It isn’t the prettiest of applications, even with the fairly recent design refresh, but it is functional, reliable and usable.
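Just to show how small a feature this is, here is a minimal sketch of an IMAP login where the username is not the email address. It is obviously not Outlook’s or Thunderbird’s actual code; it assumes the imapflow package for Node, and the host and credentials are placeholders:

```typescript
// A minimal sketch, not Outlook's or Thunderbird's code: logging in to an IMAP
// server where the username differs from the email address. Uses the "imapflow"
// package; host and credentials are placeholders.
import { ImapFlow } from "imapflow";

const client = new ImapFlow({
  host: "mail.example.com", // placeholder IMAP server
  port: 993,
  secure: true,
  auth: {
    // The login my host requires is not the email address --
    // exactly the field the new Outlook doesn't let you set.
    user: "hosting-account-12345",
    pass: process.env.IMAP_PASSWORD ?? "",
  },
});

async function checkInbox(): Promise<void> {
  await client.connect();
  const mailbox = await client.mailboxOpen("INBOX");
  console.log(`INBOX contains ${mailbox.exists} messages`);
  await client.logout();
}

checkInbox().catch(console.error);
```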
I have sent feedback to Microsoft using Outlook’s built-in feedback tool, but unsurprisingly, nothing has happened. There are also already two threads about it on Microsoft’s forums from 2023 (thread 1 and thread 2), but again, nothing has changed. I’m not getting my hopes up anytime soon and honestly, I’m not even sure I want to use it as my main mail client on Windows. I just like playing around and it’s always disappointing when my fun is spoiled by such an embarrassing omission.
I’m not incredibly familiar with the Apple II as I wasn’t around yet during its heyday, but I do know that for much of its life, its primary interface was a terminal. Eventually, Apple created a GUI desktop environment for it and you can certainly tell that both it and the Macintosh desktop environment shared a lot of the same UI principles. Anyone who has used a classic Mac OS desktop will feel right at home here.
LinkedIn’s dark mode settings default to “Always off”
There is a current trend in web design I don’t understand: websites that offer dark mode but bury it somewhere in their settings. As a web developer, I know it’s more work to offer, maintain and test both light and dark modes, which, of course, translates to costs for the companies operating those websites. So why bury the option?
As a user who has set my device to dark mode, I have clearly stated my preference, and since you already offer an automatic switch based on device settings, why not make that the default? Otherwise, it’s an absolutely horrendous user experience: I may not even be aware that there is a dark mode option, and even if I am, I have to dig through the menus to change it when I’ve already told you through my device settings that I prefer dark mode.
Of course, not all websites are like that. This blog, for example, automatically switches based on the device settings, which is really quite simple to implement.
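As a rough sketch of what that looks like (not this blog’s actual code), here is the TypeScript side of it using the standard matchMedia API; a pure-CSS version is just a single prefers-color-scheme media query:

```typescript
// A minimal sketch, not this blog's actual code: follow the device's color-scheme
// preference via the standard matchMedia API and react when it changes. In pure CSS,
// the same thing is a single `@media (prefers-color-scheme: dark)` block.
const darkQuery = window.matchMedia("(prefers-color-scheme: dark)");

function applyTheme(prefersDark: boolean): void {
  // Assumes the stylesheet keys its colors off a data-theme attribute on <html>.
  document.documentElement.dataset.theme = prefersDark ? "dark" : "light";
}

applyTheme(darkQuery.matches); // apply the preference on page load
darkQuery.addEventListener("change", (event) => applyTheme(event.matches)); // live switching
```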
So please respect my preference for dark mode. I have already told you I want to use it!
According to BNN Bloomberg, Google is now giving websites a choice: let us train our AI on your content or be delisted from search results.
Google now displays convenient artificial intelligence-based answers at the top of its search pages — meaning users may never click through to the websites whose data is being used to power those results. But many site owners say they can’t afford to block Google’s AI from summarizing their content.
That’s because the Google tool that sifts through web content to come up with its AI answers is the same one that keeps track of web pages for search results, according to publishers. Blocking Alphabet Inc.’s Google the way sites have blocked some of its AI competitors would also hamper a site’s ability to be discovered online.
Google’s dominance in search — which a federal court ruled last week is an illegal monopoly — is giving it a decisive advantage in the brewing AI wars, which search startups and publishers say is unfair as the industry takes shape. The dilemma is particularly acute for publishers, which face a choice between offering up their content for use by AI models that could make their sites obsolete and disappearing from Google search, a top source of traffic.
If this isn’t an abuse of its monopoly on search, then I clearly haven’t understood what that means. It puts publishers in a very difficult position: either let Google train its AI on their content, so that Google can show AI answers directly in the search results and make it unnecessary for users to click through to their websites, or stop showing up in the search results at all. Either way, websites are going to lose.
News recently broke that Apple is forcing Patreon to pay its 30% App Store tax for every subscription made through its iOS app or get removed from the App Store. Essentially, this means that either subscribers or creators have to swallow the cost, and I suspect most creators won’t be willing to sacrifice any of the little they earn from their creative work.
While I don’t have a Patreon account (anymore) and have never had a subscriber, I can imagine the frustration that most of these people are probably feeling. Heck, it frustrates me to no end even though I don’t rely on it for an income.
I suspect this won’t go well for Apple in the EU, where Apple has already been designated a “gatekeeper” under the Digital Markets Act and Patreon already has the right to use or link to an alternative payment system, but elsewhere, like in the US, creators are fully at Apple’s mercy.
I used to be a big Apple fan, but lately, I’ve started to lose interest in the company as it keeps acting in ways that rub me the wrong way. This is just the latest example. Unfortunately, the quality of their products is still better than most, which makes switching to something else difficult for me. Windows has made major improvements over the years and I dabbled with Linux for a while too, but macOS is still my favorite OS and hardware of its quality is nearly impossible to find elsewhere.
In the smartphone and tablet market, the only real alternative is Android, and since I dislike Google even more, there is nowhere else to go.
In any case, I hope the anti-trust regulators have a field day with this and Apple is forced to play nice.
Alex is a developer, a drummer and an amateur historian. He enjoys being on the stage in front of a large crowd, but also sitting in a room alone, programming something or reading a scary story.