Now, as the year ends, it strikes me that in the new year I'll mark the 30th anniversary of the first reporting system I built for a client. At the same time, I have just passed the 10th anniversary of completing the Coursera course "The Data Scientist's Toolbox".
“They always say time changes things, but you actually have to change them yourself” – Andy Warhol.
I guess this is true. Of course, things do change without your participation, but what matters is how you change with them.
These two events are both significant, but in different ways. Of course, there have been many other data- and reporting-related events before, between, and after these two, but right now, these are the ones that stand out.
The older event marked the very first time I completed a full system to be used by ordinary users, not IT people. They were skilled in statistics and knew their data, but needed to learn about things like the PC, Windows, and Microsoft Access – all of which I taught them in a course, together with how to use the reporting system itself.
The back-end work was done by a colleague of mine, who wrote an automated script that extracted all the needed data from the Digital ISAM database used by our Laboratory Information Management System. Through the primitive networks we had then, and using ODBC, I connected Access to the extracted data and built a set of useful reports with aggregations and other processing. That gave the laboratory technicians valuable information for planning their work, as well as the information they would send back to the doctors who had ordered the tests.
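The kind of aggregation report described above can be sketched with a minimal modern analog. This is not the original system: it uses Python's built-in sqlite3 in place of Access over ODBC, and the table and column names are entirely hypothetical, invented for this illustration.

```python
import sqlite3

# In-memory stand-in for the extracted laboratory data.
# Table and column names are hypothetical, for illustration only.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE samples (
    sample_id INTEGER, department TEXT,
    test_name TEXT, turnaround_hours REAL)""")
con.executemany(
    "INSERT INTO samples VALUES (?, ?, ?, ?)",
    [(1, "Chemistry",  "Glucose", 2.5),
     (2, "Chemistry",  "Glucose", 3.0),
     (3, "Hematology", "CBC",     1.5)])

# The sort of aggregation a planning report might show:
# sample counts per department with average turnaround time.
rows = con.execute("""
    SELECT department, COUNT(*) AS n, AVG(turnaround_hours) AS avg_tat
    FROM samples
    GROUP BY department
    ORDER BY department""").fetchall()

for dept, n, avg_tat in rows:
    print(f"{dept}: {n} samples, avg turnaround {avg_tat:.1f} h")
```

The point is the shape of the work, not the tools: extract once on the back end, then let the front end run simple GROUP BY aggregations that users read as finished reports.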
It was valuable for their needs at the time – but for me, it marked the beginning of thinking about data reporting across several stages and different systems, combining the value of each into what the users experienced as a simple front-end product.


The most significant thing I learned from the Coursera course was the statistics language R, but the very concept of big data, and an understanding of where data science had moved, were also major gains.
I knew about big data before, as a concept, but I had not worked with it. And I may have heard the term data science before, but I didn't really know what it actually was – that field of study had not existed when I got my first education, and I had never worked with anyone trained as a data scientist.
Big data, 10 years ago, would fit into the 64 gigabytes of memory that an R session could handle on the machines of the day. Of course, everybody knew that this was at the low end, but it was still valid to talk about it as big data.
And Fast Forward to 2026
The changes around the use, reporting, and analysis of data keep coming, with serious improvements made almost every year. Yet, looking back, they do not seem that significant. Thirty years ago, a bunch of people in a laboratory were happy to get some reports, and I can imagine a similar bunch of people being just as happy to get similar reports today, if those reports contained what they needed.
And you can still do a lot of analysis on 64 gigabytes of data. Even though there are much bigger databases available, not all analyses need to use all the data, so that 10-year-old course is actually still useful and valid.
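To illustrate the point that not every analysis needs all the data at once, here is a minimal sketch (plain Python, with made-up values) of a streaming mean that processes records one chunk at a time, so the memory footprint stays constant no matter how large the dataset is:

```python
from itertools import islice

def chunked(iterable, size):
    """Yield successive lists of at most `size` items from an iterable."""
    it = iter(iterable)
    while chunk := list(islice(it, size)):
        yield chunk

def streaming_mean(values, chunk_size=1000):
    """Mean over an arbitrarily large stream, one chunk in memory at a time."""
    total, count = 0.0, 0
    for chunk in chunked(values, chunk_size):
        total += sum(chunk)
        count += len(chunk)
    return total / count if count else float("nan")

# A generator stands in for a dataset too big to hold in memory:
# one million values cycling 0..99, never materialized as a list.
big_stream = (x % 100 for x in range(1_000_000))
print(streaming_mean(big_stream))  # mean of 0..99 repeated -> 49.5
```

The same chunk-at-a-time idea is what lets a machine with 64 gigabytes of memory do useful work on datasets far larger than that.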
We tend to talk about time versus IT, and time versus data, as if the world becomes a new one every few years, requiring new technologies, new software, and new buzzwords. But in reality, life is mostly the same, jobs are mostly similar, and hence the needs are close to identical to what they have always been.
So, my advice to all readers: don't be too focused on all the new tech – at least not because you believe it is necessary, because it probably isn't. It can be interesting to study and work with, and it is easy to be seduced by the idea of staying at the cutting edge, always using the newest tools. But the users don't care much about that – they just want the data and insights they need.
May 2026 bring you prosperity, happiness and joy!

