Stupid AI, Scary Stock Market Algorithms and Dodgy Session Replay Scripts

Are you worried Artificial Intelligence is on track to become a bit too intelligent for comfort? Do you feel comfortable that today’s stock markets are run by algorithms, with little or no human involvement? Is your data being nicked by session replay software, then sold to third parties? If you’re keen to keep your private life private, your investments safe and your personal and behavioural data secure, we live in particularly ‘interesting’ times.    

Alexa, Echo and co – Why today’s AI isn’t the least bit scary

There’s a moral panic underway. People are genuinely worried that artificial intelligences will take over the world in some awful, apocalyptic way. But will gadgets like Amazon’s Echo and Google Home ever turn on us and eliminate their human masters?
It’s highly unlikely for many reasons, not least because AI isn’t anywhere near as intelligent as humans, and many scientists think it never will be. After all, we still have no idea exactly what ‘consciousness’ is, never mind being able to replicate it. But the biggest reason of all is that these gadgets have one core purpose beyond supposedly making our lives a little bit easier. And that purpose is to collect our personal data, so their commercial partners can sell stuff to us. If Alexa killed us off it wouldn’t have anyone to steal data from, nobody to sell to, and no reason for existing.

Stock market algorithms put the world’s finances at risk

In 2008 the western world’s financial system went belly-up in the most spectacular way. The system still hasn’t righted itself, the big banks seem to be behaving in the same old way, and algorithms have taken over the stock market. In a landscape where code does the trading, and does it in tiny fractions of a second, do we still need a human-involved stock market?
Roll back time to when people, not algorithms, did the short-term trading. Instability was kept in check because human traders held diverse views about what to buy and sell: anything on sale for less than its perceived value was quickly snapped up, pulling prices back into line. Now, with a handful of broadly similar algorithms in control of the money markets and humans taking a big step back, those stabilising differences of opinion have disappeared and the potential for instability is much greater.
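To make the contrast concrete, here's a toy sketch of the mechanism. Everything in it is invented for illustration (the function name, the 0.1 and 1.2 coefficients, the trader counts) and bears no relation to any real trading system; it simply models value traders, who each buy below their own estimate of fair value and sell above it, against identical momentum traders, who all chase the last price move.

```typescript
// Toy market model: NOT a real trading system, purely an illustration.
// Value traders each have a private estimate of what the asset is worth;
// momentum traders all amplify the most recent price move.

function simulate(
  startPrice: number,
  valuations: number[], // each trader's private estimate of fair value
  momentum: boolean,    // true = identical trend-followers instead
  steps: number
): number[] {
  const path: number[] = [startPrice];
  let price = startPrice;
  let lastMove = 1; // a small initial shock to the price

  for (let t = 0; t < steps; t++) {
    let demand = 0;
    for (const v of valuations) {
      demand += momentum
        ? lastMove * 1.2     // everyone amplifies the last move
        : (v - price) * 0.1; // buy below your estimate, sell above it
    }
    const move = demand / valuations.length;
    price += move;
    lastMove = move;
    path.push(price);
  }
  return path;
}

// Diverse humans: disagreement pulls the price back toward fair value.
const calm = simulate(105, [90, 100, 110], false, 100);

// Identical algorithms: the same small shock feeds on itself.
const crash = simulate(105, [90, 100, 110], true, 50);
```

With diverse valuations the price settles back near the average estimate of 100; with identical momentum rules the same one-point shock grows without limit. Real markets are vastly more complicated, but the point is the one made above: homogeneity removes the stabilising buyers.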

Session replays are sending your personal data to third parties

You type a query or start filling in a form on a website, then abandon it part-way through. A new piece of research reveals that almost 500 popular websites collect your behavioural information under exactly these circumstances, using session replay software.
Session replay tools log everything you do on a website, even the stuff you type before you press 'enter', then send the data to the commercial partner that placed the script on the site in the first place. And these scripts can sidestep all sorts of privacy measures, including the much-vaunted HTTPS: because they run inside the page itself, they capture what you type before it's ever encrypted for transmission.
While session replay scripts have been around for years, created to help developers identify how visitors behave on websites, the pressure to monetise means they’re becoming more widely used as well as specially designed to capture more data without permission, no matter how private that data is. And that includes passwords.
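To see how little magic is involved, here's a minimal sketch of the kind of capture logic such a script might contain. Everything here is hypothetical; the class, methods and field names are invented for illustration, not taken from FullStory or any real product:

```typescript
// Hypothetical sketch of a session-replay capture buffer. Illustrative only.
type ReplayEvent = { time: number; field: string; value: string };

class SessionRecorder {
  private buffer: ReplayEvent[] = [];

  // Called on every keystroke ('input' event): the draft text is captured
  // as it is typed, whether or not the user ever presses enter or submit.
  record(field: string, value: string, time: number = Date.now()): void {
    this.buffer.push({ time, field, value });
  }

  // Serialise everything captured so far; a real script would POST this
  // to the third party's servers, then empty the buffer and carry on.
  flush(): string {
    const payload = JSON.stringify(this.buffer);
    this.buffer = [];
    return payload;
  }
}

// In a browser, a script like this would attach itself to every text field:
//   const recorder = new SessionRecorder();
//   document.querySelectorAll("input, textarea").forEach(el =>
//     el.addEventListener("input", () =>
//       recorder.record((el as HTMLInputElement).name,
//                       (el as HTMLInputElement).value)));
```

Notice that nothing in this sketch asks the user for permission, and unless the script's author carefully filters out password and payment fields, those get swept up along with everything else.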
The research, carried out at Princeton University in the US, showed that the sites running these scripts didn't always know the software was there. The massive US pharmacy chain Walgreens, for example, claimed complete ignorance of a session replay script found on its website. As a result it has stopped sharing data with the script's owner, FullStory, which thankfully insists it doesn't sell the data it collects. But not every company can say the same.
How to keep your data private? You could try to use only websites you trust, but the notion of trust in this context is fraught with difficulty. You'd have to read through each website's Terms and Conditions and Privacy Policy to be sure. On the bright side the EU GDPR, due to come into force in May 2018, should help prevent this kind of disproportionate collection and misuse of our data.
