The Security Interviews: Protecting Your Digital Self


As our lives become more interconnected and more of our activities, both offline and online, are recorded, we generate increasingly complex records of who we are. This digital self is a virtual representation of our lives that people use as a basis for judging us without ever meeting us in person, and that algorithms use to inform the decisions they make about us.

“These days, nobody’s life is just offline,” says Ben Graville, founder of Visible. “As we go about our daily lives, whether we like it or not, a side effect of our conscious use of technology is an unconscious data trail that leaves a digital shadow – a detailed representation in data of who we are, how we think and the things we do. It’s a manifestation of us, but one we didn’t know we were leaving behind.”

Our digital self is a virtual footprint: a trace of our online life that persists long after we are gone. It is comparable to physical body language, which accounts for more than half of how we communicate.

The digital self is generated through social media and other online activities, using both content (what we post, such as social media updates, blog posts and playlists) and the associated metadata (where, when and how we post, as well as how frequently). When building a virtual profile, the metadata can often be just as revealing as the content itself.
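As a rough illustration of how revealing metadata alone can be, the following sketch (using invented timestamps, not data from any real service) infers the outline of someone’s routine purely from when they post, without reading a single word of content.

```python
from collections import Counter
from datetime import datetime

# Invented post timestamps for illustration: metadata only, no content.
post_times = [
    "2023-03-01T07:42:00", "2023-03-01T22:15:00", "2023-03-02T07:38:00",
    "2023-03-04T13:05:00", "2023-03-06T07:51:00", "2023-03-06T22:40:00",
]

hours = Counter()
weekdays = Counter()
for ts in post_times:
    dt = datetime.fromisoformat(ts)
    hours[dt.hour] += 1                # when in the day this person is active
    weekdays[dt.strftime("%A")] += 1   # which days of the week they post

# Even a handful of timestamps hints at a routine: repeated early-morning and
# late-evening weekday posts suggest a commute and an evening wind-down.
print("Most active hours:", hours.most_common(3))
print("Most active days:", weekdays.most_common(3))
```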

“Social media is the tip of the iceberg,” Graville says. “That’s the most obvious part, because that’s where your human interactions are. There are probably more decisions made for you by algorithms than by humans.”

The internet never forgets, and it is this persistence that makes it so powerful in generating a complex representation of our lives. Even after a website or service shuts down, internet archives ensure that nothing is truly gone.

Our digital selves may not be a true representation of who we are. Due to the anonymous nature of the internet, there can be a temptation to share exaggerated or extreme posts that may be intended as a joke or to spark discussion. However, since these posts remain forever, the original context may be unclear and the posts may not be received as originally intended.

The persistent nature of the internet means that, over time, we generate vast amounts of online data from which our digital selves can be formed. Because this virtual fingerprint is freely shared and distributed, it can be used by companies to assess job applicants for their suitability. Of course, digital identities can also be exploited by criminals, for example to identify when people are away from home, and they can be used for research purposes, such as investigative journalism.


Organizations are also using machine learning algorithms to automate decision-making based on publicly available data. For example, the Department for Work and Pensions (DWP) is starting to use machine learning as a tool to help identify fraudulent Universal Credit claimants.

Social media algorithms also use the digital selves of their users to curate content, identifying the posts, articles and ads each user is most likely to engage with. This effectively creates echo chambers, reinforcing users’ worldviews and political biases. As a result, the content people are shown may not be the content they need to see to form balanced or rational judgments.
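A minimal sketch of that idea, using an invented user profile and invented topic tags rather than any real platform’s ranking model, shows how scoring posts by past engagement naturally narrows what a user sees:

```python
# Invented engagement counts per topic for one user, and a few candidate posts.
user_engagement = {"politics_left": 40, "football": 12, "gardening": 2}

candidate_posts = [
    {"id": 1, "topics": ["politics_left"]},
    {"id": 2, "topics": ["politics_right"]},
    {"id": 3, "topics": ["gardening", "football"]},
]

def predicted_engagement(post: dict) -> int:
    # Score a post by how often this user engaged with its topics before.
    return sum(user_engagement.get(topic, 0) for topic in post["topics"])

# Rank the feed by predicted engagement: familiar viewpoints float to the top,
# opposing ones sink, and every new interaction strengthens the bias next time.
feed = sorted(candidate_posts, key=predicted_engagement, reverse=True)
print([post["id"] for post in feed])  # [1, 3, 2]
```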

“We know that algorithms are built to increase engagement and get your attention, and they play on confirmation bias,” Graville says. “One of the main reasons why the center of politics has disappeared online is that you essentially have algorithms deciding how to adjust your worldview based on your digital self, thinking they are being helpful. But actually, for society as a whole, it’s probably not that helpful.”

Own your digital self

By taking ownership of their digital selves, people can ensure they don’t inadvertently misrepresent themselves online. Learning how they are represented in the data allows them to understand what drives the decisions made about them. From there, they can manage how they appear online, showcasing their qualities and improving their profile.

“Understanding how your digital body language is interpreted is critical to your well-being and success in the real world,” Graville says. “We have more than one sense to be able to judge people, we have sight and hearing, but when you do most of your business online, you don’t have the luxury of those other senses to make those decisions in a human context.”

To properly manage your digital self, you must first understand how your virtual representation is currently viewed. Ideally, this is done by an independent outside viewer who has no pre-existing biases that could color their perspective.


There are already tools, such as Visible, being developed to provide an overview of the digital self. Because these apps draw on the same publicly available data that profiling algorithms use, they can offer an unbiased view of how a person appears online.

“We see ourselves as a deep technology company, as it is a fully decentralized and unified AI [artificial intelligence] approach,” Graville explains. “The data on your devices does not leave your devices, except to talk to the service you are trying to talk to. We don’t see it. Visible runs locally on your machine; there is no cloud infrastructure or sharing of personal information.”

Being aware of their digital self and how it is perceived allows people to recognize the driving forces behind that perception. This makes them better able to change their online behavior to present a truer image of who they are. Online behavior covers not only the content posted on social media, but also the timing and frequency of interactions and the devices used.

“Seeing the basic data that’s out there – which might be something around your demographics, where you live, your age, your online activity, the way you talk, the way you share things, what you say, and the things that the people around you say (guilt by association) – enables people to understand how their digital selves would have formed.”

There is also the option of mitigating distortions in the digital self by deleting old social media posts. While the internet never forgets, the impact of these posts can be diminished. This will change how algorithms perceive people and, in turn, how they are represented online.
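As a sketch of what that clean-up might look like in practice, the snippet below works over a local export of posts (the structure and field names are invented here, not any particular platform’s format) and flags anything older than five years for manual review:

```python
from datetime import datetime, timedelta, timezone

# Invented export format: a list of posts with an ISO-8601 creation time.
posts = [
    {"id": "a1", "created_at": "2014-06-01T10:00:00+00:00", "text": "old joke"},
    {"id": "b2", "created_at": "2024-01-15T18:30:00+00:00", "text": "recent update"},
]

cutoff = datetime.now(timezone.utc) - timedelta(days=5 * 365)

for post in posts:
    created = datetime.fromisoformat(post["created_at"])
    if created < cutoff:
        # Flag rather than delete automatically: an old post may still carry
        # context worth keeping, and deletion is irreversible.
        print(f"Review for deletion: {post['id']} ({created.date()})")
```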

One technique that can help in managing our virtual representations is comparing the digital self with how others present themselves online. This is the opposite of peer pressure: comparing your digital self with those of your peers establishes a baseline of what is typical, as well as showing how to stand out for the right reasons.

UK data protection regulations are currently being overhauled to allow for greater use of user data and thus enable the country to become a hub for AI and machine learning research. “Since we’ve left the EU, the government has taken the chance to review our data protection law,” Graville says. “They are considering removing some of the safeguards around data protection and machine decision-making, which would make it easier for AI to flourish in the UK.”


This will enable greater sharing of personal data and in turn mean that people’s digital shadows will become an increasingly complex set of data networks.

The principle of net neutrality is a powerful foundation of the internet. However, the internet has evolved to become biased towards business: personal data can be freely shared, but it can also be exploited. Just as we can’t avoid communicating through body language in a physical setting, it is impossible not to share our digital selves. However, taking psychological ownership of that data, and knowing how it is used and monetized, allows people to change their online behavior and avoid having their data exploited.

“To stop us from entering a dystopian world in the future, people need to feel empowered to own their identities and digital selves, and to use that to make the internet a fairer place, where people can benefit from the digital you as much as businesses do,” Graville says.

The free and open nature of the internet means we can’t avoid sharing our data and still be online. As such, our digital self will continue to offer a reflection of who we are, no matter how inaccurate that image may be. By understanding how they are perceived online, people can take control of their digital selves to ensure their virtual presentation is a true reflection of who they are, promoting the qualities they most want to display.
