AS the stakes get higher and the high-ups get crazier, there’s always respite in tales of friends and family. Yet even stories of beloved four-year-olds and their happy playtime experiments aren’t free from the ominous low-notes of the times.

So this wee yin I know has been living with a device in her home called an Amazon Echo Dot – the size of a hockey puck – inside which lives an artificial intelligence called Alexa. You say her name clearly, and Alexa wakes up, her device’s circumference glowing blue.

“Alexa, sing ‘Twinkle, Twinkle, Little Star’.” It’s done. “Alexa, can someone bring me a pizza?” What kind of pizza would you like, says a kindly American lady’s voice. “Alexa, tell me the Cinderella story”. There was, many years ago, a gentleman who had a charming lady for his wife… Now don’t worry. As soon as this charmer spots me, I’m dragged off to the doll’s house and set to work with copious piles of plastic bacon and Lego parts. A proper, non-virtual childhood is mainly happening here.

But her dad – a man of the gadget persuasion – marvels at the way his daughter swipes and flips her devices. And he is particularly stunned at the way this device has so easily become one of the domestic voices she naturally refers to. Indeed, the whole family is following her lead.

“I am a bit anxious,” says dad, “because I know it gets better and better by listening to our speech patterns all the time, constantly analysing us. Who else might be listening in? But in the meantime, it’s like science fiction come to life.”

What kind of science fiction, though? A device like Alexa sits at the crossroads of our utopian or dystopian mindsets about technology.

Take Spike Jonze’s movie Her – where Theodore (Joaquin Phoenix) falls in love with his disembodied artificial assistant Samantha, smokily voiced by Scarlett Johansson. Her tends towards the bittersweet, even spiritual version of these digital companions.

Samantha loves Theodore too, but she also loves hundreds of her other users (such is her infinite processing capacity). At the end of the movie, she leaves him to ascend to another level of unimaginable digital existence, leaving behind his piddling human concerns.

The downside perspective on these devices is, all too obviously, 1984. Remember the telescreen installed in every Party member’s home, ensuring total control over the lives of the population?

We have heard enough from Snowden and Manning, of GCHQ and the CIA, about their freewheeling access to our digital interactions, to resonate with Orwell’s vision. Stunningly, most of this super-surveillance is now official law in the UK, under the Investigatory Powers Act. This was recently described by Amnesty International as “among the most draconian” mass surveillance laws in the EU.

At least if you switch off the laptop, or lift your fingers from the smartphone, you give the powers-that-be no data to track. But a device that sits listening to everything, helpfully serving you, nestled deep in your hearth? Could there be an easier entry point for ambitious techno-totalitarians?

The softeners are already being offered. Health providers say they could use the timbre of your voice to detect health issues, before you even know you have them.

A dreadful device called ElliQ sets itself up as an “active ageing companion”. This glowing lozenge urges elders to take their pills, not to miss their swimmercise classes, and so on – all under the benign android glow of the device’s eyes and ears.

There does seem to be a direct clash of powerful interests here. Rosy-glow marketing trumpets these net devices as our new domestic utilities, like fridges or washing machines. Info-capitalists are desperate to find new markets for digital services.

Yet we know that right-leaning governments are forcing these smart devices, by the power of law, to have the capacity to watch, hear and read you at all times. How can a consumer “trust” an info-brand, when the state behind it is going rogue with your privacy and liberties?

“Take back control” is the most supple and usable of phrases. One might easily imagine it pressed into service to defend our digital civil liberties, against this matrix of micro-control.

However, as it happens (and with mirthless irony) it’s the Trumpsters and the Brexiteers who turn out to have been the most purposeful users of surveillance tech – by micro-targeting voters’ data-behaviour, in order to amass them as a winning majority.

Let me introduce you to your latest, favourite, moustache-twirling evil corporation: Cambridge Analytica. Hired by the Leave campaign first, then the Trump campaign (and paid $5 million by them in the last month of campaigning), this British organisation claims, in the words of its chief executive Alexander Nix, to have been able to “profile the personality of every adult in the United States of America – 220 million people”.

The identification isn’t socio-economic, but psychological – noting people’s online likes and preferences, and matching them to what psychologists call the ‘Big Five’ personality traits: openness (how open are you to new experiences?); conscientiousness (how much of a perfectionist are you?); extroversion (how sociable are you?); agreeableness (how considerate and cooperative are you?); and neuroticism (how sensitive/vulnerable are you?).

In a 2016 video where he strides about the stage like a cut-rate posh-boy Bond villain, Alexander Nix talks about his method. You buy available info – “what car you drive, what products you purchase in shops, what magazines you read, what clubs you belong to”, as well as your social media “likes”. And then you use that to construct your voter types.

A “Neurotic and Conscientious” type, says Nix, won’t respond to the same message about, say, gun laws (or immigration) as a “Closed and Agreeable” type. Fifty thousand variable messages can be produced to hit these psychometric types right at their emotional soft spots.

A rather haunted academic, Michal Kosinski, made the breakthroughs that led to this process: “I just showed that the bomb was there”. But Cambridge Analytica – who already have Steve Bannon, Trump’s alt-right Svengali, on their board – are being lined up to serve the post-Trump era. Their job will be to continue to map the psychometric trigger-points of his supporters.

This is sinister, powerful stuff: a combination of ‘soft’ and ‘hard’ knowledge that looks like it can win plebiscites, against the evidence of more traditional polling methods. Dominic Cummings, the Mekon of Brexit, has recently stated that “if you want to make big improvements in communication, my advice is – hire physicists.” That is, those who can crunch human data into patterns that can be campaigned on.

So how do we deal with all this? Like Ford Kiernan’s angry man in Chewin’ The Fat, do we stomp around the house ripping out data points, snipping smart cards and smashing modems? Mibbes naw. We may need to develop a better info-politics than one halfway between King Canute and General Ludd.

Perhaps we should start to argue that all these agencies, state, commercial and political, are profiting from data that we have generated ourselves.

How do we assert our ownership, as a citizenry, over these data assets? What structures can we build which could flip the relationship around – to one that presumes that our data is first and foremost ours, which companies and agencies can only request access to?

We need a moment of political opportunity, and some bold reforming spirits, to be able to imagine and then build such structures. I can see the bold Joanna Cherry QC, along with the data campaigner Alastair Robertson, taking on exactly this topic in a post-indy Scotland.

Tech critic Rebecca MacKinnon notes that we tend to treat digital networks as “the air that we breathe and the water that we drink”. But she says: “Programmers and executives and editors and designers, they make this landscape. They are human beings and they all make choices.”

So maybe we need to look at these clever devices and ask questions about what they’re habituating us to.

Sorry, wee yin. But Uncle Pat may be getting you a T-shirt emblazoned with these words for your birthday: Program – Or Be Programmed? You can try it out on Alexa. But it may be a question she is, by definition, unable to answer.