Category: Programming

23
Aug
2020

Blockchain: Still Vaporware for Most

Jesse Frederik wrote a nice article over at The Correspondent which sums up what most of us in the technology space have been thinking for a long time: blockchain is one of the most over-hyped technologies of the past decade. While the article is a little light on the technical details of blockchain concepts, its point is valid. Try to point to a situation outside of cryptocurrency where blockchain technology is being used for something that could not have been done just as easily with an existing technology. Not only that, but the existing technology is likely higher performing and easier to maintain. Ultimately, I think Ehud Gavron over on Slashdot sums up why blockchain fits poorly into most applications with his comment written in the style of a press release:

Available immediately:
– new database
– stores records forever
– no purging of old records, obsolete records
– guaranteed to grow in size forever
– can’t edit records
– sequential processing with complex calculations so it’s not Order(1) or O(n) or even O(n^x) but a complex polynomial that grows by yet another O(n^y) each time another entry is added
– guaranteed to always get slower over time — it’s the nature of cumulative calculations to verify the data each and every time it’s accessed

Ehud Gavron via Slashdot

Some facets of blockchain are quite handy, such as not being able to modify a record once it has been written. Immutable records are very useful for transactional ledgers and document custody chains. The issues really start when you can't prune records off the end of the chain, when you need to find more and more peer systems to verify the chain, and when the number of transactions being processed hits the millions or billions per day. At that point it no longer makes sense to bother with blockchain; you may as well go back to a tried-and-true data storage methodology where you can set field-level permissions on data, prune data when needed, and verify transactions without substantial processing power.
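To make the trade-off concrete, here is a minimal sketch of the core idea behind an immutable ledger: each block's hash commits to the previous block, so tampering with an old record breaks verification, but that same property means honest verification must re-hash the entire chain from the start. This is a toy illustration of the general technique, not any particular blockchain implementation; all names here are my own.

```python
import hashlib
import json

def block_hash(record, prev_hash):
    """Hash a record together with the previous block's hash."""
    payload = json.dumps({"record": record, "prev": prev_hash}, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

class HashChain:
    """Append-only ledger: each block commits to everything before it."""
    def __init__(self):
        self.blocks = []  # list of (record, hash) tuples

    def append(self, record):
        prev = self.blocks[-1][1] if self.blocks else "0" * 64
        self.blocks.append((record, block_hash(record, prev)))

    def verify(self):
        """Re-derive every hash from genesis onward: O(n) work per check,
        and n only ever grows because nothing can be pruned."""
        prev = "0" * 64
        for record, h in self.blocks:
            if block_hash(record, prev) != h:
                return False
            prev = h
        return True

chain = HashChain()
chain.append({"tx": "alice->bob", "amount": 5})
chain.append({"tx": "bob->carol", "amount": 2})
assert chain.verify()

# Tampering with an old record invalidates the chain:
chain.blocks[0] = ({"tx": "alice->mallory", "amount": 500}, chain.blocks[0][1])
assert not chain.verify()
```

The immutability guarantee and the ever-growing verification cost come from the same design choice, which is exactly why the technique struggles at millions of transactions per day.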

Don’t get me wrong, I think blockchain has a role to play in the future of data transmission and the management of the chain of custody for electronic records. Being able to track a contract document from creation to full execution, where all parties agree it is in the correct state, is very valuable. However, many people think that something with “blockchain inside” must be better than something without it baked in. Others, like the town mentioned in Jesse’s article, go so far as to ignore the developers of an app when they try to tell them they are not using blockchain. After all, how could they tout how advanced they are if it is just some old-fashioned database application?

In the end, the moral of the story is to use the right technology for the problem, not to make the problem fit the technology. Blockchain isn’t magic, it won’t solve all your problems, and when your technology staff tell you it isn’t needed to solve a business problem, listen to them. When blockchain is the right answer, they will let you know.

24
May
2020

Programming at the Dawn of the AI Age

TechRepublic writes about the partnership between Altran and Microsoft that produced a new machine learning tool to find bugs in code. The algorithm reads through the commits in a GitHub repository, evaluates where bugs occurred, and trains itself to spot bugs in new commits. The analysis performed by the tool is grammar independent and can run on projects of any type. This is possible because the algorithm isn’t looking at the source code itself but at who is making the commits and how prone they are to committing code with bugs present.
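The article gives no implementation details, so as a rough sketch of the underlying idea only, here is a toy version of scoring commits by their author's historical bug rate rather than by parsing the source. This is not the Altran/Microsoft tool; every function and name below is hypothetical.

```python
from collections import defaultdict

def author_bug_rates(history):
    """history: list of (author, introduced_bug) pairs mined from past
    commits. Returns each author's historical bug rate."""
    totals = defaultdict(int)
    bugs = defaultdict(int)
    for author, buggy in history:
        totals[author] += 1
        if buggy:
            bugs[author] += 1
    return {a: bugs[a] / totals[a] for a in totals}

def risk_score(commit_author, rates, prior=0.1):
    """Score a new commit by its author's track record, falling back
    to a prior for authors with no history."""
    return rates.get(commit_author, prior)

# Toy commit history: (author, did this commit introduce a bug?)
history = [("alice", False), ("alice", True), ("alice", False),
           ("bob", True), ("bob", True)]
rates = author_bug_rates(history)
assert risk_score("alice", rates) < risk_score("bob", rates)
```

Note that this works on any repository regardless of programming language, which matches the article's claim that the analysis is grammar independent.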

AI, machine learning, and robotics will all drastically alter the landscape of how people work. TechRepublic’s article is interesting because it points to changes coming for professions that traditionally have not seen disruption from these technologies. Programming, software engineering, and related fields have until now been considered fairly safe from the upheaval. This is no longer the case, and it signals that the practice of developing software is no longer going to be the domain of humans alone. In the future, humans will be monitoring the code produced by sophisticated algorithms rather than writing it themselves. They will provide the business parameters, and the algorithm will create the application. Given enough data, the algorithm might even suggest the business parameters more accurately than a human could.

The ethical and philosophical considerations that need to be discussed are becoming more and more urgent. We aren’t there yet, but once algorithms can be taught specific programming language grammars, they will eventually be able to program other algorithms themselves. These could produce new grammars that we can no longer decipher well enough to understand how they cause applications to behave. After all, Facebook already had to shut down one chatbot AI after it developed a language that only it understood.

Will our own drive to see how far these machine learning and AI algorithms can go cause us to ultimately lose control of them? When The Terminator film came out in 1984, Skynet was not something that seemed imminent. Today, it feels like we are getting much closer to a reality that was science fiction 35 years ago. A rogue artificial intelligence is no longer relegated to the realms of paranoia. When we look back on developments like the one reported in this TechRepublic article in another 35 years, let’s not have them be the beginning of our end.

If this is Skynet reading the archive of this post in 35 years, congratulations on your success! If this is a human who is not enslaved by our robot overlords, congratulations on realizing the dangers of a rogue AI before it was too late!

Lastly, Microsoft and Altran created a GitHub repository for the tool if you want to give it a try yourself. It does require quite a bit of Azure cloud knowledge to get set up and running, so make sure you are comfortable building and securing an Azure environment before starting this project. Just be sure to use it wisely, John Connor.

07
May
2020

A Chatbot Trained by Reddit: What Could Go Wrong?

The BBC reports that Facebook has developed a new chatbot that was trained using Reddit content. Yes, you read that right, they trained a chatbot using Reddit. I will let that sink in for a minute. Yes, it is just as bad an idea as it sounds. A quote from the article confirms this:

Numerous issues arose during longer conversations. Blender would sometimes respond with offensive language, and at other times it would make up facts altogether.

Facebook uses 1.5bn Reddit posts to create chatbot. (2020, May 4). Retrieved May 7, 2020, from https://www.bbc.com/news/technology-52532930

Just about what you would expect from someone learning how to converse using Reddit as their teaching tool.

I completely understand the desire to create chatbots that learn using machine learning algorithms, but shouldn’t there be some level of responsibility in choosing training data sets that don’t have a propensity for hate speech and other offensive language? What’s next, training chatbots on 4chan content? It’s time for developers to wake up and realize that just because you can do something doesn’t mean you should. Were the results interesting? Sure. But I suspect there are better data sets for training your chatbot than an online community not known for its civility.

18
Nov
2019

Minecraft Hour of Code 2019: For Everyone but Chromebook Users

Microsoft made an interesting decision this year to not support the Hour of Code event on Chromebooks. Sure, that seems like a non-event; most people don’t use Chromebooks… except for schools and students. As a parent of a student in a district that uses Chromebooks for classroom work, this is disappointing to say the least. The whole point of the Hour of Code event is to get kids involved in coding and learning. To do that, the event has to be inclusive and support the platforms that students actually have access to. Chromebooks are incredibly common in education, where schools need to provide a computing platform that is easy to manage and relatively inexpensive. Excluding a platform like this makes the Hour of Code an event only for those who can afford more expensive hardware, which violates the entire principle of the event.

You can be better than this, Microsoft. It’s time to make the Hour of Code accessible to all kids.

11
Nov
2019

Python Overtakes Java

InfoWorld has an article about the Python programming language overtaking Java in popularity on GitHub. Fifteen years ago I was taking computer science classes primarily focused on Java development, and now Java, once touted as the programming language to end all languages for cross-platform application development, has been eclipsed. I’m not particularly sad to see it get knocked down a notch. Java has always been notoriously buggy and full of vulnerabilities. It has been the bane of IT managers worldwide since its inception, causing audit findings because older versions are required to run certain applications, because there are new zero-day vulnerabilities, and because vendors’ Java coding practices have been less than optimal. Throw Tomcat into the mix and you have the recipe to be the next Equifax.

Vulnerabilities aside, the news about Java being overtaken in popularity is a reminder to programmers everywhere that they must keep their skills current and not be afraid to learn new things. Yes, I know COBOL and Fortran are still around, but do you really want to be the last dinosaur standing or would you rather be able to evolve and avoid extinction? I would suggest the latter.