2023: A pivotal year in privacy


Last week, I spoke with the National Cybersecurity Alliance about changing expectations around privacy and why 2023 will be a pivotal year for data protection across the globe.

We are living in an incredible time. The next wave of innovation is upon us, with technological advancements that will help solve society’s greatest challenges. To ensure that we realize the profound benefits of this era of rapidly advancing technology, we will need to nurture the opportunities presented and address concerns about potential harms. It will be up to all of us across industry and government to protect #privacy and other fundamental human rights and earn trust.

I addressed these issues – and much more – in my remarks last week in honor of Data Privacy Day and thought I'd share them here as well. 

Summary of my comments made during a conversation with the National Cybersecurity Alliance in the “Designing a Privacy-First World” LinkedIn Live Event on January 26, 2023.

National Cybersecurity Alliance: Microsoft operates all over the world, with both consumer and enterprise customers. What are you hearing from customers about privacy, and what are their most pressing concerns today?

It’s important to consider the variety of customers and their unique expectations and needs. First, there are the customers who are individuals, such as consumers and employees; then, separately, there are our commercial customers.

Consumers are telling us that they have heightened expectations of companies about privacy. They want us to be aware of their needs, including by helping them understand how we are protecting their data.

They also are asking us to develop services that will keep them safe, keep their children safe, and give them more control over their data. They want to engage online the same way they engage in real life, which means having control.

Our commercial customers are seeing a complex regulatory environment not just around privacy but across a number of regulatory issues. They want us, as key partners, to help them navigate the complexity through products, solutions, and guidance. They see the same thing I see: We are in a rapidly evolving, dynamic time for organizations.

Lawmakers and regulators are seeking to address complex problems. For example, they are concerned about the impact of technology and social media on kids, and they are thinking about advertising and its impact on privacy, among much else.

Parliaments, congresses, and state houses around the globe are offering solutions to these issues, which presents companies with the challenge of understanding this complexity. We need to simplify this landscape for our customers and help them navigate this space through product solutions and internal governance approaches.

National Cybersecurity Alliance: Privacy is clearly growing, and you have a broad regulatory view across privacy, digital safety, and AI. What should we be paying attention to in the broader landscape?

It’s important to pay attention to both the legal requirements that are being developed, and the expectations that our customers and our broader group of stakeholders have for the companies they engage with today. As we think about the regulatory landscape, and broaden the aperture beyond privacy, the first port of call for the conversation is the European Union.

The EU will continue to lead on a number of regulatory issues. Policymakers, regulators, and civil society in Brussels are proud of the seminal law they have created in the GDPR and its adoption in various flavors around the globe.

But these thought leaders in Brussels are also grappling with some perceived gaps in the GDPR, as well as some issues it was never really designed to address. That is where new regulations come in: the Digital Services Act (DSA) and the Digital Markets Act (DMA) are two important examples of this next regulatory wave. The DSA is designed to address online safety and harms, and the DMA is designed to address competition concerns – particularly those stemming from companies that serve as gatekeepers to markets that other companies want to access.

Both of these regulations, the DSA and the DMA, touch on privacy issues such as advertising, but they are much more immersive and comprehensive in the breadth of issues they address.

The EU is also focusing on Artificial Intelligence (AI), with an AI Act in development. The draft EU AI Act differs in its approach from the GDPR, the DMA, and the DSA. In its current form, it examines issues through a product safety lens. However, there is an ongoing discussion about how the law should be shaped to address concerns while allowing AI innovation to flourish, so it will be a while before it is adopted.

At the same time, in the US we see standards bodies and the Administration stepping in to develop frameworks for responsible AI. We’re seeing the National Institute of Standards and Technology (NIST) and the Office of Science and Technology Policy (OSTP) within the Administration developing frameworks for assessing the design and use of AI systems.

Of course, we cannot describe the global regulatory landscape by focusing only on the EU and the U.S. We need to think globally about the kinds of immersive laws that are under discussion elsewhere. There is no better example than what we are seeing in India. The government of India is advancing a comprehensive legal framework for its entire digital ecosystem, which will include rules around personal data, non-personal data, and potentially data residency and data transfers to other countries. And there is a lot of regulatory activity in South America, Australia, Canada, and African nations.

The net result is a new wave of regulation that will address AI, digital safety, security, and privacy in a complex and more immersive manner.

Companies are going to need to pay attention to this evolving and immersive environment that is right at our doorstep.

National Cybersecurity Alliance: How can a U.S. company change its perspective on global norms, and from your experience, what are the best practices?

At a company like Microsoft, which develops technology that can change the world, we need to ensure our products are designed responsibly and comply with local rules.

We think about helping our engineers build compliance into their products from the start. Beyond just complying with the law, we also focus on customer expectations that go above and beyond the law as we responsibly develop products.

Across key domains like privacy, security, safety, and AI, we start with internal principles that combine regulatory requirements and customer expectations. These principles are embedded in our implementation guidance for all of our engineers throughout Microsoft about how to infuse trustworthiness into our products and solutions. As an example, our Responsible AI Standard is built to ensure that AI will have a positive impact on society and people. With respect to privacy, we are guiding our engineering teams to focus on how we use data, how we can provide transparency, and how we can build user controls across our ecosystem.

In addition to all of this, you need to think about the culture inside your company.

One lesson we learned at Microsoft as we rolled out our compliance with the GDPR was the need to embrace the EU’s concept of personal data. The U.S. system operated through the lens of PII, or Personally Identifiable Information, but in the EU and in the context of the GDPR, “personal data” is a much broader concept. To be successful in meeting the requirements of the law and customer expectations, we needed to shift our culture towards the EU model, including a focus on the broader set of data that “personal data” encompasses. We took the time to meet with regulators and key customers in the EU to understand this issue, and we then layered this new cultural lens into our systems by infusing the EU concept of personal data into our solutions.

National Cybersecurity Alliance: 2023 is clearly a pivotal year in privacy – what do you want people to know about privacy at this moment in time?