Privacy is dear to many people’s hearts, especially online. Recently, this has been brought into sharp relief with Google’s circumvention of users’ browser settings and the much trumpeted change to their privacy policy. Unsurprisingly it’s generated masses and masses of comment and advice.

For me, the current argument has highlighted a fundamental challenge for anyone developing apps and sites: how do you retain users’ trust? And while many developers will simply take the view that their job is to create the best app/site/experience they can, the big question remains: for whom?

This is important, not just for UX reasons, but for more ethical ones too. Because as soon as ‘who’ stops being users and starts being advertisers or shareholders, the temptation to ride roughshod over people’s privacy mounts.

In reality, no matter how many safeguards are put in at the browser level, some clever developer will find a way round them. They’ll serve up fresh cookies from local storage, or embed them in every facet of a site, or find some other ingenious way of getting round the ‘problem’ of privacy.
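To make that concrete, here is a minimal, hypothetical sketch of the “respawning” trick described above, in TypeScript for the browser. The idea is simply that a tracking ID is mirrored into localStorage, so even if the user clears their cookies, it is quietly written back on their next visit. All names here are illustrative, not taken from any real tracking library, and the point is only to show why browser-level safeguards alone can’t guarantee a user’s choice is respected.

```typescript
// Hypothetical illustration of cookie "respawning" from localStorage.
const TRACKING_KEY = "tracking_id";

function readCookie(name: string): string | null {
  const match = document.cookie.match(new RegExp(`(?:^|; )${name}=([^;]*)`));
  return match ? decodeURIComponent(match[1]) : null;
}

function writeCookie(name: string, value: string): void {
  // One-year expiry; path=/ so the cookie follows the user across the whole site.
  document.cookie = `${name}=${encodeURIComponent(value)}; max-age=${60 * 60 * 24 * 365}; path=/`;
}

function respawnTrackingId(): string {
  // Look in the cookie first, then fall back to the copy stashed in localStorage.
  let id = readCookie(TRACKING_KEY) ?? localStorage.getItem(TRACKING_KEY);
  if (!id) {
    id = crypto.randomUUID(); // fresh identifier on first visit
  }
  // Re-plant the ID in both places, defeating the user's attempt to clear either one.
  writeCookie(TRACKING_KEY, id);
  localStorage.setItem(TRACKING_KEY, id);
  return id;
}
```

Deleting the cookie achieves nothing here: the next call to respawnTrackingId() restores it from localStorage, and vice versa. Which is exactly the point.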

Of course, privacy is not a ‘problem’ for users (except when it is violated). Through the settings they select, users make a clear and unambiguous choice about their desired level of privacy. Once a developer ignores that choice and is found out, they break the user’s trust. At that point we instantly move from people seeing the relationship as give-and-take (where they’ll help to evangelise and improve our sites and apps) to seeing it as purely take-and-take (where they’ll rightly dump all over us on social media and leave).

Ultimately, a breakdown in trust is not in anyone’s best interests. And that’s something all of us (not just those inside the Googleplex) need to keep firmly front and centre.