The Dressler Blog

When you build digital stuff all day, you develop opinions. Lots of opinions.


Digital Trends

Ransomware: Good business or great business?

Last week, many owners of out-of-date and badly maintained PCs suffered a crippling blow when all of their files were suddenly encrypted by the infamous WannaCry ransomware. Ordered to pay $300 in bitcoin to get their computers decrypted, many victims chose simply to pay and avoid further inconvenience. No one knows exactly how much money the cybercriminals who distributed WannaCry have made, but estimates range as high as $1 billion. This is an absurd amount of profit considering that the perpetrators didn't even discover the vulnerability they exploited. (N.B. I have been told by a well-informed source that it wasn't nearly this much money, but still…) That discovery was the work of our fearless protectors at the National Security Agency, who not only stockpiled vulnerabilities in Windows to exploit but then committed the error of letting their stockpile of hacks leak. Meaning that someone made lots and lots of money by repurposing code paid for by American taxpayers.

Why does this matter? If ransomware were not illegal, WannaCry would have all the makings of a tech unicorn. Not only did its operators quickly make loads of money on ubiquitous adoption and low licensing fees (okay, ransom demands), but they made highly effective use of Bitcoin, one of the hottest technologies going. We can fault them for the fact that their tech was actually written by someone else, but Microsoft did the same thing back in the day. And honestly, it's about time someone made money from the hidebound companies and bureaucracies that have been running archaic, vulnerable versions of Windows for years. Old versions of Windows are every developer's pet peeve, because old operating systems come with old browsers, and old browsers don't render properly.

In a nutshell: Upgrade already, you cheapskates.

Syntax, Structure, & Translation

"In the beginning God created the heaven and the earth.
And the earth was without form, and void; and darkness was upon the face of the deep." There are as many as 7,000 languages on earth, and those two sentences have been translated into the majority of them. Google Translate, by far the most widely used online translation engine, can translate them into about 100 languages. But the existence of Bible translations in thousands of minor and almost-forgotten languages has long been a boon to linguists. Now a new generation of linguists is using machine learning and the Bible to probe the underlying grammatical similarities among languages, in order to understand the relationships between languages and the origins of language, and perhaps someday to create a universal translator. Professors at Ludwig Maximilian University of Munich in Germany have been taking common grammatical structures (like the past tense) and analyzing how they vary across thousands of languages using machine-learning algorithms.

Why does this matter? Linguistics is the next great problem for technology. As we rush to abandon the keyboard and embrace voice input, we place an unjustifiable faith in the consistency and rationality of language. But linguists tell us that language changes radically over time. Semantic drift is an ongoing process, and voice technology needs to adapt to changing language at least as fast as humans do. The sheer variety of languages also presents a problem. Old languages die and new creoles are born all the time, and the speakers of those languages will require technology too. Tying voice input to a single language is absurd: it means rebuilding the input mechanism for each additional language. It makes more sense to keep the processing layer language-agnostic, with a universal translator handling inputs. If that universal translator is built on solid linguistics and machine learning, it just might keep up with the rapid changes in human languages.
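The comparison the Munich researchers are doing at scale can be sketched with a toy example. Assuming a hand-made table of binary grammatical features (the feature values and language entries below are illustrative inventions, not real typological data), pairwise similarity between languages falls out of simple counting:

```python
# Toy sketch: compare languages by shared grammatical features.
# Feature values below are illustrative assumptions, not real data.

FEATURES = ["suffixed_past_tense", "subject_verb_object",
            "definite_article", "grammatical_gender"]

LANGS = {
    "English": [1, 1, 1, 0],
    "German":  [1, 0, 1, 1],
    "Swahili": [0, 1, 0, 0],
    "French":  [1, 1, 1, 1],
}

def similarity(a, b):
    """Fraction of grammatical features two languages share."""
    matches = sum(1 for x, y in zip(LANGS[a], LANGS[b]) if x == y)
    return matches / len(FEATURES)

def nearest(lang):
    """Most grammatically similar other language in the toy table."""
    others = (l for l in LANGS if l != lang)
    return max(others, key=lambda other: similarity(lang, other))

for lang in LANGS:
    print(lang, "->", nearest(lang))
```

With thousands of languages and hundreds of features extracted from aligned Bible verses, clustering on exactly this kind of similarity is what lets machine-learning systems recover family relationships between languages.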
In a nutshell: Interested in voice input? Hire a linguist.

Fail Week at Snap

Pity the mighty unicorns of 2014. One by one, they crash to earth in a sticky and malodorous heap of sexual-harassment lawsuits and missed earnings. Could it be that, outside the golden sunlight of Silicon Valley and the forgiving embrace of venture capitalists, these world-beating disruptors look a little less glorious? Last week the bloom came off the proverbial rose for social media powerhouse Snap, Inc., which completely whiffed on earnings. Snap executives were quick to point out that the miss was due to payouts to employees associated with the initial public offering. But analysts also noticed that user growth seems to be slowing, potentially an indicator that Snap is not so much the next Facebook as the next Twitter. Being "the next Twitter" is not considered a compliment in these circles.

Why does this matter? First, it's time to be a little more realistic about Snapchat. The company competes head-to-head against Facebook, and I can guarantee you that Zuckerberg and company have no intention of losing. Second, Snap doesn't have a fiercely competitive business prodigy like Zuckerberg running the show, nor has it had the wisdom to bring on an experienced hand, as Google did with Eric Schmidt. It is making silly mistakes, like failing to give technology stock analysts guidance about earnings and growth. Finally, soft user growth is a very bad sign for such a young social network. The principle of "network effects" is a cruel mistress: if you are growing, you are growing obscenely; if you aren't, you're dead.

In a nutshell: Snap may not be the company we thought it was.

Quantum computing: the now and the soon

The first microchip was just a couple of transistors wired together.
Over time, driven by the omnivorous needs of the technology industry, the number of transistors on a chip increased until today each tiny microchip contains billions of transistors, each about 10 nanometers wide (roughly 100 atoms). Moore's Law, which postulates that computing power will double every 18 months, has not been driven by reductions in transistor size alone. Forced up against the physical limits of transistor size, chip designers have continued to deliver on Moore's Law by running multiple chips in parallel and by specializing chips for particular purposes. But there is another law in computing, Amdahl's Law, which says that parallelism has limits because certain computations must be done sequentially. Ultimately, limitations rooted in the binary logic of classical computers might put a stop to Moore's Law were it not for the deus ex machina of quantum computing. Quantum computing allows us to encode information into controllable quantum states, and those states are described by continuous amplitudes rather than a single binary digit, so information is no longer reduced to a zero or a one. Each additional quantum bit, or qubit, added to a quantum system effectively doubles the size of the state space that system can work with. Rudimentary quantum devices exploiting this exponential growth in computing power are already tackling computations that the world's most advanced supercomputers could never complete.

Why does this matter? Quantum computing is going to hit technology like a tsunami. Software engineers who have spent their careers building software for classical computers will suddenly find their skills don't apply to these new systems. When people's self-interest is involved, one can anticipate massive resistance to the integration of quantum computing into the existing technology industry. Usually, disruptive technologies require a "killer app" to overcome such resistance.
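Amdahl's Law, mentioned above, can be made concrete in a few lines of arithmetic. A minimal sketch (the 10% serial fraction is an assumed figure for illustration, not a measured workload):

```python
# Amdahl's Law: if a fraction s of a workload is inherently serial,
# the maximum speedup from n parallel processors is 1 / (s + (1 - s) / n).

def amdahl_speedup(serial_fraction, processors):
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / processors)

# Assume 10% of the work must run sequentially.
s = 0.10
for n in (2, 16, 1024):
    print(f"{n:5d} processors -> {amdahl_speedup(s, n):.2f}x speedup")

# Even with unlimited processors the speedup can never exceed 1/s = 10x,
# which is why parallelism alone cannot keep Moore's Law going forever.
```

Note how quickly the returns diminish: with a 10% serial fraction, 1,024 processors deliver barely more than a 9.9x speedup.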
But Chad Rigetti of quantum computing startup Rigetti Computing suggests that the killer app might be something as mild as a quantum layer of optimization on top of existing machine-learning systems. Rigetti's company is focused on getting quantum computers and classical computers to work together seamlessly, each performing the tasks it excels at. Quantum computing may be a once-in-a-generation revolution in technology, but it looks like it will sneak into the enterprise in sheep's clothing.

In a nutshell: Learn this now, before your technological skills start to look like a quaint anachronism.
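The claim that each added qubit doubles the state space can be checked with a toy statevector simulator. This is a minimal sketch in plain Python, assuming nothing beyond the standard library; real quantum devices manipulate these amplitudes physically rather than storing them in memory:

```python
import math

# A register of n qubits is described by 2**n complex amplitudes,
# so every qubit added doubles what a classical simulator must store.

def uniform_superposition(n_qubits):
    """State after applying a Hadamard gate to every qubit of |00...0>:
    an equal superposition over all 2**n basis states."""
    dim = 2 ** n_qubits
    amp = 1.0 / math.sqrt(dim)
    return [amp] * dim

for n in (1, 2, 10, 30):
    print(f"{n:2d} qubits -> statevector of {2 ** n:,} amplitudes")

state = uniform_superposition(3)
# Measurement probabilities must sum to 1: eight outcomes at 1/8 each.
assert abs(sum(a * a for a in state) - 1.0) < 1e-12
```

At 30 qubits the statevector already holds over a billion amplitudes, which is why classical simulation runs out of road so quickly and why even rudimentary quantum hardware becomes interesting.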

Give us your email to sign up for our weekly Dressler Digital Trends. Stop trying to keep up and start getting ahead.