“A user’s experience of a business and its services will only be as pleasant as the business is trustworthy. Treat visitors with respect and remove barriers to access (such as multiple data requests and spam), and you’ll improve usability — and empower your audience in the process.”
Posts in category 'Identity'
“At the center of the broader societal debate is Boyd, whose views on key issues like online privacy are followed closely by tech companies and policy makers. An opponent of “regulation for its own sake,” as she puts it, Boyd, 32, has become a go-to source for companies (from Google on down), government agencies, and academics seeking insight into youthful behavior in a 24/7 digital universe.
She prides herself on diving deeply into what young people think and feel about their use of social media. With her tongue stud, bracelets, and neobohemian style of dressing, she fits in seamlessly with her target demographic, even while joking that they all “think I’m an old lady.””
“A century ago, when the first home phones were “party lines” shared by neighbors, “worrying you were being listened in on was a common feature of American culture,” says sociologist Claude Fischer of the University of California-Berkeley.
Oh, how times have changed.
Now, we’re not only unconcerned about overheard phone calls, we purposely broadcast our personal business to large groups of “friends” and “followers” on social networks such as Facebook and Twitter.
As a result, we’re fast becoming a nation of casual eavesdroppers, where every day we tune in to a constant stream of updates on what others are saying and doing, from where they’re about to eat lunch (complete with photos) to their conversations with others.
All this sharing, some experts say, may be feeding a tendency toward exhibitionism, and devaluing the very privacy that earlier generations so desired.
But not everyone says the rise of widespread social snooping is such a bad thing.”
“Each time Facebook’s privacy settings change or a technology makes personal information available to new audiences, people scream foul. Each time, their cries seem to fall on deaf ears.
The reason for this disconnect is that in a computational world, privacy is often implemented through access control. Yet privacy is not simply about controlling access. It’s about understanding a social context, having a sense of how our information is passed around by others, and sharing accordingly. As social media mature, we must rethink how we encode privacy into our systems.”
“The many privacy-related issues raised by the Web will be amplified in the world of mobility, and even more so in a world dominated by sensor networks. Current thinking seems to converge on one important conclusion: through the combined interaction of law, technology and Internet literacy, people should be in a position to control how their own personal information is made available and used for commercial (or other) purposes.
In this post, we explore the feasibility of users managing their own data, i.e. if we indeed want users to manage their own data, what are the issues involved in making this happen? We also look at an alternative, i.e. allowing devices to mirror social privacy norms. Hence, I see the discussion as ‘Changing user behaviour to incorporate new device functionality’ or ‘Changing device behaviour to mirror privacy expectations in human interactions’.”
“Internet companies have appropriated the real estate business’s mantra — it’s all about location, location, location.
But while a home on the beach will always be an easy sell, it may be more difficult to persuade people to start using location-based Web services.” [...]
“For now, many people say sharing their physical location crosses a line, even if they freely share other information on the Web.”
“Experts say that the huge growth of the internet has in effect created a “permanent memory” online that can be searched by anyone. Embarrassing statements, photographs, or angry attacks by spiteful ex-friends once faded away. But no longer. [...]
There are now many firms offering help in keeping people’s online history safe. They include companies and websites like Online Reputation Manager, Reputation Professor, Reputation Defender and Reputation Management Partners.”
At the annual Premsela Lecture, a speaker from outside the world of design addresses current developments in the field. The lectures are organised by Premsela, Dutch Platform for Design and Fashion.
““Dress, clothes and fashion are rare topics in the social sciences,” Etcoff said, “particularly the branch I inhabit, at the intersection of neuroscience and psychology. Perhaps that is because historically, there has been far more interest in reason and the mind than in emotion and the body, in depth rather than surface, [although] dress has as much to do with reason as emotion, as much to do with the mind as the body, and as much to do with our inner depths as our surface.”
She outlined the variety and importance of our reasons for adornment, ending with a call to designers to use science to push fashion further in its enhancement of human well-being.”
Etcoff, author of Survival of the Prettiest: The Science of Beauty, is a faculty member at Harvard Medical School and a practicing psychologist at Massachusetts General Hospital.
Martijn de Waal contributed to this issue with the article: “New Use of Cellular Networks – The Necessity of Recognizing the Nuances of Privacy”.
According to media researcher Martijn de Waal, it is time to rethink our ideas of privacy. The growing use of cellular networks is generating data that plays an important role in civil society projects. To be able to continue using such data in a meaningful and fair way, people must become aware that privacy is not simply a question of either private or public, but includes many new gradations in between.
Some other articles are also available online.
“Textual and technical illiteracy is often cited as a barrier to the adoption of services, and by default the benchmark for success is often set at ‘understanding and completing the task by oneself’. However, if there are ‘literate’ people nearby, to what extent does it matter that the user is illiterate?
‘Mediated use’ is simply recognising that part or all of a task or process is mediated through others.
Facebook and “radical transparency” (a rant) (14 May)
The battle that is underway is not a battle over the future of privacy and publicity. It’s a battle over choice and informed consent. It’s unfolding because people are being duped, tricked, coerced, and confused into doing things where they don’t understand the consequences. Facebook keeps saying that it gives users choices, but that is completely unfair. It gives users the illusion of choice and hides the details away from them “for their own good.”
Facebook is a utility; utilities get regulated (15 May)
What’s next is how this emergent utility gets regulated. Cuz sadly, I doubt that anything else is going to stop them in their tracks. And I think that regulators know that.
“The conventional wisdom suggests that everyone under 30 is comfortable revealing every facet of their lives online, from their favorite pizza to most frequent sexual partners. But many members of the tell-all generation are rethinking what it means to live out loud.
While participation in social networks is still strong, a survey released last month by the University of California, Berkeley, found that more than half the young adults questioned had become more concerned about privacy than they were five years ago — mirroring the number of people their parents’ age or older with that worry.
They are more diligent than older adults, however, in trying to protect themselves. In a new study to be released this month, the Pew Internet Project has found that people in their 20s exert more control over their digital reputations than older adults, more vigorously deleting unwanted posts and limiting information about themselves.”
Interestingly, “mistrust of the intentions of social sites appears to be pervasive.”
This new report by the UK think tank Demos is an up-close and personal investigation into how people feel about the use of their personal information. The British public might not be as reserved as we like to think.
The database society is not inherently good or bad. The best we can hope for is that it is as democratic as any of the institutions, markets, and regulatory and legal systems that exert power over our lives. The rules governing information use will determine our power as individuals in the database society and the powers that the state, businesses and other people have over us. As the infrastructure of the database society passes through a formative stage, it is important to understand more about how the use of personal information is understood by the people it affects.
Democratising personal information does not only mean giving people a voice in the debate. It means finding better ways of listening to what they say. This pamphlet is about what people think about the use of their personal information. It sets out the findings of Demos’ ‘People’s Inquiry into Personal Information’, revealing the opinions and ideas expressed over 13 hours of deliberation. The inquiry demonstrates how to engage in the conversations that bring personal information decision-making closer to the people it affects.
“Americans tend to be less concerned than Europeans. Privacy, after all, is not a clear constitutional right whereas freedom of speech is. Freedom of speech is actually the first article in the U.S. Bill of Rights. It’s not that Americans don’t value privacy, but they often view it as a tool to prevent government from overstepping its authority. This represents a fundamental difference in the way Americans and Europeans react to privacy issues.
In Europe, privacy is considered a basic human right. Article 8 of the European Convention on Human Rights spells it out: “Everyone has the right to respect for his private and family life, his home and his correspondence.” To put things in perspective, freedom of speech does not come until Article 10.”
Here are a few of the reviews, from which I have distilled some telling quotes:
“Boyd says that privacy is not dead, but that a big part of our notion of privacy relates to maintaining control over our content, and that when we don’t have control, we feel that our privacy has been violated. This has happened a few times recently. [...]
To help underscore her points, she recalled and discussed a number of major privacy blunders from Facebook and Google. [...]
Boyd then transitioned to talk a bit about the fuzzy lines between what is public and private. She says that just because people put material in public places doesn’t mean it was meant to be aggregated. And just because something is publicly accessible doesn’t mean people want it to be publicized.”
“For Boyd, her years of research have been eye-opening into the divergence between what users want–and their emergent behavior–and the ways tech companies interpret those desires. “Often,” she said, “companies trying to build efficiencies into their systems profoundly misunderstand what they’re trying to be efficient about.” [...]
“There’s a big difference between publicly available data and publicized data,” she said, “and I worry about this publication process, and who will be caught in the crossfire.”
“We are going to see a continued emergence of new tools that complicate the boundaries between the public and the private, and technology will continue to make a mess of it.”
“Ultimately, then, for the people who build these systems,” Boyd said, “it is imperative that they ask questions about what people really want and what people want to achieve.”
“For marketers, it’s essential to remember that the accessibility of people’s information online doesn’t necessarily indicate that they want to be seen by you. Just because you can interpret people,” Boyd said, “doesn’t mean you’re going to get it right. Just because you see something doesn’t mean you know what’s going on.”
And to the systems designers on hand for her keynote, Boyd had one final message: “As designers, you need to think through the implications and ethics of what you’re doing,” she said. “You are shaping the future. How you handle those challenges will shape the future.”
“Last week’s ruling from an Italian court that Google executives had violated Italian privacy law by allowing users to post a video on one of its services [...] called attention to the profound European commitment to privacy, one that threatens the American conception of free expression and could restrict the flow of information on the Internet to everyone. [...]
“The framework in Europe is of privacy as a human-dignity right,” said Nicole Wong, a lawyer with [Google]. “As enforced in the U.S., it’s a consumer-protection right.” [...]
Article 8 of the European Convention on Human Rights says, “Everyone has the right to respect for his private and family life, his home and his correspondence.” The First Amendment’s distant cousin comes later, in Article 10.
Americans like privacy, too, but they think about it in a different way, as an aspect of liberty and a protection against government overreaching, particularly into the home. Continental privacy protections, by contrast, focus on protecting people from having their lives exposed to public view, especially in the mass media.”
Excerpts from this post (translated into English):
“The Italian ruling on Google says fundamentally that the judges do not consider the [YouTube] platform to be a publisher (Google was not convicted of defamation), but they do consider it responsible when privacy legislation is violated, in particular with regard to the sharing of sensitive data related to a person’s health. It might be that the problem could simply be resolved by adding a button to the platform, so that users, when about to publish something, have to declare that the uploaded content does not violate privacy legislation. We shall see. [...]
One cannot ignore the fact that the written grounds for the ruling have not yet been published. Once the judge publishes them, it will become clear whether he did indeed take all this into account correctly – pointing out simply that Google’s terms and conditions at the time did not take all precautions to prevent users from uploading material that damages privacy. In that case the whole thing would be a lot less worrisome, and platforms, in order to comply with the law, would just need to be clearer in asking users to pay attention to privacy matters.”
A second post provides some further reflection:
“The right to freedom of information and the right to privacy are increasingly in conflict. And all those who want to reduce the first can appeal to the second. [...]
And even if it all comes down to the platform needing to ensure that those who publish content have all the rights to do so, even by first asking third parties before publication, this will generate enormous complications for any platform that deals with user-generated content. If it is just a matter of a better description in the terms and conditions, then it could be resolved rather easily.”
Google Video: Italian law is complicating the world
“So now platforms that allow users to publish online content have become responsible for possible violations by those same users? That’s what an Italian judge has just decided. And this will have global legal consequences.
Judge Oscar Magi – the same judge [who dealt with the CIA kidnapping] of Abu Omar – has convicted several Google Italy executives of violating Italian privacy law, because they allowed the publication of a video showing a teenager with Down’s Syndrome being bullied. The judge acquitted the three of the defamation accusation.
In practice, the ruling seems to state that Google would have had to obtain the consent of all parties involved – directly or indirectly – in the publication of these images.
This lower court decision is not final [and can be appealed]. But it opens a very complicated future scenario for all internet access providers, and most of all for platforms that let users publish informational and other video content directly.
Taken to its logical conclusion, this ruling means that before publishing anything whatsoever about third parties on Twitter, Flickr, YouTube, or Facebook, users need to first obtain consent from those third parties – and if they do not, the platforms themselves are also responsible. The platforms would therefore need to supervise everything their users publish.
That could be a very serious blow to the world of user-generated content. This ruling should be looked at carefully by all those people and entities who care about the web as a place for freedom of information – with all its good and bad, its risks and opportunities.”
In fact, according to the BBC, Google’s lawyer “questioned how many internet platforms would be able to continue if the decision held.”
In any case, here is Google’s answer. And yes, they are going to appeal.
“The web-based survey gathered opinions from prominent scientists, business leaders, consultants, writers and technology developers. It is the fourth in a series of Internet expert studies conducted by the Imagining the Internet Center at Elon University and the Pew Research Center’s Internet & American Life Project. In this report, we cover experts’ thoughts on the following issues:
- Will Google make us stupid?
- Will the internet enhance or detract from reading, writing, and rendering of knowledge?
- Is the next wave of innovation in technology, gadgets, and applications pretty clear now, or will the most interesting developments between now and 2020 come “out of the blue”?
- Will the end-to-end principle of the internet still prevail in 10 years, or will there be more control of access to information?
- Will it be possible to be anonymous online or not by the end of the decade?
Fast Company focuses on privacy:
Experts were nearly split down the middle, with 55% agreeing that Internet users will be able to communicate anonymously and 41% agreeing that, by 2020, “anonymous online activity is sharply curtailed.” Not only are there divergent opinions on whether online anonymity will be possible in the future, there isn’t even a consensus on whether anonymity is universally desirable.
ReadWriteWeb takes a broader view and highlights some key quotes from the report.
MSNBC instead focuses on literacy:
A decade from now, Google won’t make us “stupid,” the Internet may make us more literate in a different kind of way and efforts to protect individual anonymity will be even more difficult to achieve, according to many of the experts surveyed for a look at “The Future of the Internet” in 2020.
“Our privacy faces new challenges: behavioural advertising can use your internet history to better market products; social networking sites used by 41.7 million Europeans allow personal information like photos to be seen by others; and the 6 billion smart chips used today can trace your movements.
The European Commission today – Data Protection Day – warned that data protection rules must be updated to keep abreast of technological change to ensure the right to privacy, legal certainty for industry, and the take-up of new technologies. EU rules say that a person’s information can only be used on legitimate grounds, with their prior consent.
With the Lisbon Treaty and the Charter of Fundamental Rights now in force, the Commission today said it wants to create a clear, modern set of rules for the whole EU guaranteeing a high level of personal data protection and privacy, starting with a reform of the 1995 EU Data Protection Directive.”
(via eGov monitor)