Artificial intelligence and surveys: crucial questions about the new technologies affecting our lives

The pervasive use of voter surveys is already a convoluted affair, but a new layer of complexity, one that seems to have gone unnoticed, was added last Sunday, when the Times of Malta published the results of a survey that used machine learning to fill the gaps in its data.

The exercise ultimately predicted that if a general election were to be held now, the Labour Party would win by more than 50,000 votes, securing 57% of the electorate’s support.

Problematic surveys

Surveys, especially those that deal with opinions on complex issues, continue to have serious limitations, not least because we human beings are complicated and so is the way we form our opinions.

The context of a phone interview or online survey doesn’t resemble the real-life circumstances in which we form our opinions on issues, and respondents are often under pressure to express definite views about something of which they may be uncertain, conflicted, or even uninformed.

Then there’s the matter of methodology: there is no shortage of research on how election surveys are designed, administered, and analysed, especially following the abysmal prediction results of the 2016 and 2020 presidential elections in the USA.

After the mixed polling results around the 2020 presidential election, some technology experts believe that artificial intelligence (AI), used increasingly by companies to assess customer sentiment, could hold promise for better understanding the electorate.

And yet, even among the companies that did use AI to predict the outcome of the 2020 US elections, the results were mixed.

AI on top of that

Artificial intelligence comes in many stripes, but the kind that has generated the most enthusiasm in recent years is the subfield known as “machine learning”. Broadly speaking, machine learning looks for patterns in massive amounts of data and uses existing information to generate new information. One of its most prevalent uses is prediction.
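To make that concrete, here is a minimal sketch of how a machine learning model might be used to “fill the gaps” in survey data, in the spirit of the exercise described above. Everything in it – the data, the column names, and the choice of model – is hypothetical and illustrative; it is not the pollster’s actual method, which was not disclosed in detail.

```python
# Illustrative sketch only (hypothetical data and columns, not the published survey's method):
# train a classifier on respondents who declared a voting intention, then use it to
# predict the likely preference of those who did not.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier

# Hypothetical survey responses: demographics plus a declared voting intention,
# with None marking respondents who gave no preference.
survey = pd.DataFrame({
    "age":       [23, 67, 45, 34, 58, 29, 71, 40],
    "district":  [1, 3, 2, 1, 4, 2, 3, 4],
    "past_vote": [0, 1, 1, 0, 1, 0, 1, 0],      # party voted for last time (encoded)
    "intention": [0, 1, 1, 0, 1, None, None, None],
})

declared = survey[survey["intention"].notna()]
undecided = survey[survey["intention"].isna()]

features = ["age", "district", "past_vote"]
model = RandomForestClassifier(random_state=0)
model.fit(declared[features], declared["intention"].astype(int))

# "Fill the gaps": predict how the undeclared respondents might vote.
predicted = model.predict(undecided[features])
print(predicted)
```

The pattern the model learns is only as good as the training data it is given, which is precisely why the methodological questions raised below matter.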

The new information is then used in numerous ways, from powering our streaming and online shopping recommendations to predicting disease risks, but there are also more questionable uses, including surveillance and manipulation.

The promise of AI is that, overall, algorithms can make many routine predictive exercises more precise and less prone to idiosyncratic distortion – and consequently fairer – than human intuition. This has the enormous potential to do good, but the technology also has some limitations and issues that need to be considered.

One of the problems is bias, ironically one of the very issues AI is meant to eliminate. However, technologies are not neutral: they are only as good or as bad as the people who develop them.

If the automated systems increasingly used by governments fail to incorporate the principles real humans use to make fair decisions, you get situations where, for example, students in the UK had their exam results downgraded despite strong academic records, because the grades were based on their school’s past performance rather than their own.

Or you get software used to predict future criminals that shows bias against black people. Add to this the numerous ethical and legal conundrums that come with the technology’s progress and increased use.

Curious citizens

Many governments are increasingly adopting artificial intelligence. The EU alone documents more than 290 AI policy initiatives in individual EU member states between 2016 and 2020. The government of Malta published its national AI strategy in 2019.

When AI was used to predict how people might vote in Malta, it was rather disheartening to see over 400 comments about the results of the survey but just a handful about the methodology, especially since the details provided in the follow-up article were sparse.

If we’re about to see more of this technology used as envisaged by the government’s national strategy, what do we as citizens need to know about this technology?

We can’t expect to become experts, but should we not be a little curious as to how it is likely to affect us as customers, homeowners, students, educators, patients, prison inmates, members of minorities, and voters in liberal democracies?
