The Facebook-Cambridge Analytica scandal that broke a year ago underscored the value and vulnerability of personal data in the internet age. Twelve months and many promises later, the public remains unconvinced about privacy protections amid a wave of data breaches. As sentiment sours and regulators step in, businesses — from Facebook and Google to the smallest of e-commerce firms — could now be compelled to rethink how they do business.
SINGAPORE (Apr 15): In March last year, Mark Zuckerberg posted a 937-word missive on his Facebook page, outlining what the social networking giant had done, and would be doing, to ensure that its users’ data would not be compromised or accessed by third parties that have no business doing so. “We have a responsibility to protect your data, and if we can’t then we don’t deserve to serve you,” the founder, CEO and chairman of Facebook wrote.
Zuckerberg’s post came five days after public outrage over news that millions of Facebook profiles had been harvested illegally and used to build software that ultimately sought to influence the outcome of the 2016 US presidential race. Cambridge Analytica — a shadowy data analytics firm, now defunct, whose leadership had included Donald Trump’s key adviser Steve Bannon — was revealed to have farmed millions of US voters’ profiles to predict their choices at the ballot box and target them with personalised political ads.
More recently, Zuckerberg again took to his profile page. This time, it was to advocate that governments and regulators take a more active role, to “update the rules for the internet”.
“We need new regulations in four areas: harmful content, election integrity, privacy and data portability,” he wrote. “If we were starting from scratch, we wouldn’t ask companies to make these judgments alone.”
Zuckerberg’s latest declaration follows a trying year for him and his company. Not long after the Cambridge Analytica scandal, Facebook reported a massive data breach involving 50 million accounts. Attackers apparently exploited a vulnerability in the platform’s code that allowed them to gain access to user accounts and potentially take control of them.
Reaction from Zuckerberg and Facebook’s top executives may have been quicker this time around, but consumers are already tiring of such breaches of their privacy. And it is not just commercial entities that have failed to protect the integrity of the data they amass.
Early this year, in Singapore, it was discovered that the highly sensitive HIV-positive status of 14,200 people — along with other information such as their identification numbers and contact details — had been exposed online. A doctor who had access to the database had taken screenshots of the information; his former partner got hold of the data and posted it online. That breach came barely six months after the identity and medical records of 1.5 million SingHealth patients, including Prime Minister Lee Hsien Loong, were accessed and copied as a consequence of poor controls at SingHealth and the Health Ministry’s IT arm.
Data deals
Little surprise then, that in a 2018 study, research firm Forrester found that users of social media platforms were fed up with data and privacy abuses. “At every turn, consumers are inundated by creepy retargeting ads, articles about privacy, and emails about companies updating their privacy policies. They’re realising that they’ve been paying for ‘free’ content with their personal data, and many are unhappy about it,” it said.
Individuals’ data — their identity, location, consumption patterns and information about myriad other behaviours and preferences — are meticulously tracked, stored, mined and monetised by tech companies, banks, retailers and just about every other business. Essentially, consumers have been “selling” their data to social media platforms and companies, but not getting paid for it.
To be sure, much of the data may well have been harvested in the exchange of goods and services. For much of the past decade, consumers willingly handed over an increasing level of detail in exchange for the multitude of services and conveniences that define modern life. Facebook, Google, Apple, Microsoft and Amazon help remember birthdays, organise tasks, send emails, store documents, pay for shopping, take messages and control home appliances. Various “wearable devices” track food and water intake and physical activity, effectively monitoring people’s daily movements in exchange for some reassurance about one’s “state of health”. Meanwhile, governments are able to amass data on the public — commuting patterns, for example — that ostensibly goes towards improving public infrastructure and services.
Simon Chesterman, dean and professor of law at the National University of Singapore, says many view personal data as a property or commodity that can be owned and sold. Companies have made and continue to make money off it. “Facebook is one example, but there are many others who make money out of the personal data of users and user behaviour,” Chesterman says. “Only half of advertising works, but no one knows which half. And so, the more you can target advertising using personal data, there’s obviously an economic benefit there.”
Mastercard, for instance, made US$3.35 billion ($4.5 billion) in FY2018 from “other revenues”, which include the sale of transaction data to retailers. In the US, the credit card company reportedly sold that data to Google, which allowed the search engine to track whether online ads led to sales at a physical store. Such deals highlight just how much consumer data companies have access to, and quietly use, for their own ends.
Still, it remains unclear exactly how much money companies really make from the sale of your data. US-based market research firm Transparency Market Research estimates that the data monetisation market will expand to US$708.86 billion by 2025, though this figure includes both direct and indirect monetisation of data.
When asked if companies can quit their addiction to monetising our data, Paul Hadjy, CEO and founder of Singapore-based cybersecurity firm Horangi, says they can, but not if it is a core part of their business. He adds that companies can actually choose not to use data beyond the scope of what is necessary.
“I think there are a lot of companies that can do without selling data,” he says. “Unfortunately the way the world works, if there’s revenue [to be made], then they’re going to push towards it, and it would take legislation or a real pushback from the community to stop it.”
Indeed, amid the rash of bad news exposing lax protections and corporate irresponsibility, public sentiment towards such an exchange is rapidly souring. Scrutiny was intensifying even before Zuckerberg suggested regulators step in. It may be only a matter of time before companies whose businesses have benefited greatly from the so-called monetisation of personal data are made to change the way they operate.
Some companies are already much more conscious about using personal data to generate revenue. Online shopping cashback site Shopback tells The Edge Singapore that it does not sell or reveal its users’ data to anyone, not even the partner merchants on its site. Although it has about five million members on the platform, Shopback generates its revenue from a commission on each sale made through the site.
“If you collect and retain data, you must protect it. So, the rule of thumb for us is to never collect more data than we require for our business needs. This will help us to streamline the amount of data that we store and protect,” says a representative.
A company such as Facebook, with its scale and resources, has the chance to take the lead in championing consumer data privacy, notes Rick McElroy, head of security strategy at US-based data security firm Carbon Black. “You can already see that from their public statements. They’re a very innovative company with very smart people, and if they want to solve the problem, they have the right people to do so,” he says. “[But] well, the question is whether they’re going to want to do it.” Facebook, when approached by The Edge Singapore, declined to comment for this article.
Widespread vulnerabilities
Apart from the sale of their data, people have had to suffer the ignominy of having their information stolen for unknown, likely criminal, purposes, as in the case of the SingHealth breach. According to Carbon Black, 96% of businesses in Singapore suffered a cybersecurity breach in the last 12 months alone. The firm found that ransomware and Google Drive breaches were the most common attacks. Financial institutions were the targets of some of the most sophisticated attacks, with 63% reporting that attacks had grown increasingly sophisticated.
Compounding the issue is the paltry sum currently invested in shoring up defences against such attacks. A 2018 report from consulting firm A.T. Kearney shows that Asean countries collectively spent only 0.06% of their GDP, or just US$1.9 billion, on cybersecurity, against the global average of 0.13% of GDP. Singapore invested 0.22% of its GDP in cybersecurity in 2017, leading Asean and ranking third globally; in comparison, Malaysia invested 0.08% of its GDP, with the remaining Asean members investing less than 0.04%.
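As a back-of-the-envelope check on those figures, the reported US$1.9 billion spend at 0.06% of GDP implies a combined Asean GDP of roughly US$3.2 trillion. A quick sketch in Python, using only the two reported numbers as inputs:

```python
# Back-of-the-envelope check on the A.T. Kearney figures:
# Asean's reported cybersecurity spend and its share of GDP
# together imply the region's combined GDP.

asean_cyber_spend_usd = 1.9e9        # US$1.9 billion (reported)
spend_as_share_of_gdp = 0.06 / 100   # 0.06% of GDP (reported)

implied_asean_gdp = asean_cyber_spend_usd / spend_as_share_of_gdp
print(f"Implied combined Asean GDP: US${implied_asean_gdp / 1e12:.2f} trillion")
# → Implied combined Asean GDP: US$3.17 trillion
```

That implied total is in the right ballpark for the bloc's economy, which suggests the report's two figures are at least internally consistent.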
This combination of rapid digitalisation and increased vulnerability, pundits say, is a recipe for disaster. “Southeast Asia’s strategic relevance and rapidly expanding digitalisation make it a prime target for cyberattacks,” warns The Asia Foundation, a non-profit international development organisation. “Asean countries have already been used as launchpads for attacks, either as vulnerable hotbeds of unsecured infrastructure or as well-connected hubs from which to initiate attacks.”
In short, as businesses of all sizes become increasingly reliant on digital data, cloud computing and workforce mobility, data breaches have become increasingly prevalent. As technology evolved, exposure to breaches grew, changing how companies manage personal data. At the same time, the fact that personal data is viewed as a commodity affects how it is treated.
Chris Gondek, principal architect of Commvault, a data management firm, says many companies today effectively have little idea of what data they have, and how it should be managed.
“The first big problem that companies face is that they don’t know what [data] they have, because it’s becoming increasingly difficult to know,” he tells The Edge Singapore. As recently as five years ago, companies still had information held in local data centres, until cloud services became de rigueur. Concurrently, the management of that data was transferred to managed service providers, or workers in remote offices, or into mobile devices. “This means that data became very distributed and therefore very complex to manage and keep track of,” Gondek says.
“More and more data was left in the hands of users who were not, typically, security conscious.” So, whose responsibility is it to keep data safe?
Legislation to the rescue?
There are various laws that have come into effect to compel companies to be more circumspect in how they use and treat data. The European Union’s General Data Protection Regulation, for example, which came into force last May, imposes far stricter, game-changing rules on the way personal data is handled, transported and stored. The maximum penalty for non-compliance is 4% of a company’s total annual global turnover (or €20 million, whichever is higher). Importantly, the law’s reach is not limited to Europe: it applies to any company that deals with European persons, holds their data or intends to collect it. This means even a Singapore-based e-commerce company with EU customers is required to comply with the GDPR.
Earlier this year, the French authorities fined Google €50 million ($76 million) for failing to comply with GDPR provisions — for not providing enough information to users about its data consent policies and not giving them enough control over how their information is used. It may seem like a huge sum, but it is a drop in the ocean compared with how severely Google could have been fined: the maximum of 4% of its global turnover could mean billions.
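To put that 4%-of-turnover cap in concrete terms, here is a short sketch; Alphabet’s 2018 revenue of roughly US$136.8 billion is an assumption used purely for illustration:

```python
# Sketch: the GDPR fine ceiling is 4% of total annual worldwide turnover.
# Alphabet's 2018 revenue (~US$136.8 billion) is assumed for illustration.

GDPR_MAX_RATE = 0.04  # 4% of annual worldwide turnover

def max_gdpr_fine(annual_turnover: float) -> float:
    """Upper bound of a revenue-based GDPR fine."""
    return annual_turnover * GDPR_MAX_RATE

assumed_alphabet_2018_revenue = 136.8e9  # US dollars, assumed
ceiling = max_gdpr_fine(assumed_alphabet_2018_revenue)
print(f"Theoretical cap: about US${ceiling / 1e9:.1f} billion")
# → Theoretical cap: about US$5.5 billion
```

Against a ceiling in the region of US$5 billion, a €50 million fine is indeed a small fraction of what the regulation allows.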
“The European experience of data protection is much more driven by the primary consideration about human rights and protecting the rights of individuals,” NUS’s Chesterman says. That approach contrasts with Singapore’s Personal Data Protection Act, he adds. “The PDPA is much more explicitly intended to balance the rights of individuals against the legitimate needs of business.”
That is not to say the PDPA is not tough on companies that fail to protect user data. Following the SingHealth breach, the government fined the system’s IT vendor, Integrated Health Information Systems (IHiS), and SingHealth a total of $1 million — the maximum penalty under the PDPA. It also formed the Public Sector Data Security Review Committee to “conduct a comprehensive review of data security practices across the entire public service”.
In California, a law coming into effect in 2020 will be the strictest yet. The California Consumer Privacy Act gives consumers more power and greater agency with regard to their private data. One key point in the bill is its “Same Service” clause, under which, “regardless of a consumer’s request and preferences about how their personal information is handled, businesses are required to provide ‘equal service and pricing… even if they [consumers] exercise their privacy rights under the Act’”.
Still, legislation alone cannot be the only component in protecting data. “Many think that protecting personal data is the job of the security team,” says Jinan Budge, principal analyst at US market research company Forrester. “That is far from the reality. The old cliché of ‘security is everyone’s responsibility’ is possibly more valid today than ever before.”
Budge says individuals need to be a lot more aware of whom personal data is given to, and what might be done with it. At the same time, businesses have a responsibility to identify the personal data that is in their care and determine what is personal and who has access to it and how it is protected. “Businesses certainly need to wake up and consider all implications of current data collection practices,” she warns.
Carbon Black’s McElroy argues that the onus is on companies. He says tougher legislation and increasing agency from users will force companies to rethink the way they do business. “Consumers are claiming ownership, that that’s their data and companies need to be responsible with that data from a privacy and sharing perspective and [be responsible for] how [businesses] monetise that,” he says. In turn, this forces companies to rethink not only how their technology is designed but also how the data is handled.
Indeed, companies should absolutely be the ones responsible for your data. Aadi Vaidya, chief operating officer of e-commerce platform Zilingo, says the user’s share of the burden is, to him, negligible.
“I feel that companies and industry giants are in a much stronger position to even extract the types of things the user probably does not want to share. And the user is probably not an expert, right? He doesn’t really know [how his data is being used]. So, I wouldn’t really put the [onus] on the user. I will say that the responsibility falls squarely on corporations,” he says.
However, Carbon Black’s McElroy does not discount the individual’s responsibility to be an advocate for his or her own data and privacy. “I think we need to stand up, and demand that our legislators create the right rules and regulations that will keep our data private and protected. And that’s going to come from us demanding it out of our leaders.
“We [also] need to [stress] cooperation across the globe. We have to fix this together because there is no single country that can fix this on its own.” McElroy adds that such an undertaking may take a generation’s time, but is nonetheless necessary.
Ultimately, it may take a fundamental shift in how data is viewed for it to be kept safe. Thinking of it as property or commodity means attaching a monetary value to it, when personal data is arguably an invaluable and inalienable part of an individual’s identity.
“If you think of personal data as not just a kind of commodity but also an inherent part of human dignity, that brings the conversation into privacy, and we don’t normally think of privacy as something that we would sell, because it is a part of your identity,” says NUS’s Chesterman. “Just as we don’t allow people to sell body parts, the argument is that personal data should be thought of in that way.
“We are at a really interesting juncture in terms of where companies will go, and what consumers, ultimately, are willing to bear.”
After all, it is inevitable that people generate data simply by existing in modern, everyday life. “You could live in a cave without a smartphone. That’s possible. But most of us would not choose to do it,” he points out.
“However, where that line is drawn is not at the existence of the data, but at the misuse of that data for identity theft, or [its theft] to make your life more difficult — for example, prejudicing [against] you or embarrassing you because of what you’ve done in the past, and thus limiting what you’re able to do in the future.”