Introduction to Privacy by Design

In the digital age, personal data is one of the most valuable and sensitive commodities. From mobile applications to online shopping and social networks, countless systems handle vast amounts of user data every second. This environment calls for a proactive, ethical, and deeply embedded approach to privacy protection. Enter Privacy by Design — a framework that reimagines privacy not as an add-on or regulatory checkbox but as a foundational component of technology and business operations.

Developed by Dr. Ann Cavoukian, former Information and Privacy Commissioner of Ontario, Privacy by Design has shaped privacy thought leadership since the 1990s. It’s increasingly relevant today, especially with the rise of data protection laws such as the EU’s General Data Protection Regulation (GDPR), Brazil’s LGPD, and California’s CCPA. This framework presents a strategic approach that places user privacy at the forefront of innovation, design, and implementation.

This article delves into the origins of Privacy by Design, its necessity in the modern world, and the first two of its seven foundational principles.

Why Privacy by Design Matters

Modern organizations operate in a world where data drives decisions, innovation, and competitive advantage. However, the misuse or mishandling of personal information can result in not just legal consequences but also massive reputational damage. As cyberattacks become more frequent and user expectations for transparency grow, the importance of embedding privacy in the foundation of systems cannot be overstated.

Most traditional approaches to privacy tend to be reactive. They address problems only after they occur—after a data breach, a whistleblower revelation, or regulatory scrutiny. These methods are costly and inefficient, and they erode trust. Privacy by Design flips this mindset by advocating for prevention and resilience. It encourages designing products, services, and systems with privacy considerations from the very beginning.

Moreover, trust is a key differentiator in today’s market. Customers are more likely to engage with companies they believe will safeguard their data. Organizations that commit to Privacy by Design not only comply with the law but also create trust-driven relationships that can endure.

The Origins and Global Adoption of Privacy by Design

Privacy by Design emerged in the 1990s when technological innovation began to rapidly outpace existing data protection legislation. Dr. Ann Cavoukian introduced this model to push developers, organizations, and policymakers to think beyond compliance.

Over time, international organizations and governments recognized the value of the framework. In 2010, the International Conference of Data Protection and Privacy Commissioners unanimously passed a resolution recognizing Privacy by Design as an essential component of fundamental privacy protection. The GDPR, which took effect in 2018, solidified this concept under the terms “data protection by design and by default,” embedding these ideas into European legal obligations.

This widespread endorsement reflects the urgent need for a privacy-forward mindset—one that anticipates risks and embeds resilience in every aspect of system development.

Principle 1: Proactive Not Reactive; Preventative Not Remedial

The first principle establishes the proactive nature of Privacy by Design. Instead of waiting for privacy threats to arise and responding afterward, this approach calls for anticipation and prevention.

In practice, this means identifying privacy risks early in the system development lifecycle and designing controls to mitigate those risks from the outset. For example, during the design phase of a mobile health application, developers should already be assessing how user data will be stored, who will have access, and what encryption mechanisms will be in place.

Proactive measures might include conducting privacy impact assessments (PIAs), implementing early-stage threat modeling, and having structured privacy engineering practices. The goal is to design systems that inherently resist data breaches and misuse.
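In practice, a PIA often begins life as a simple risk register that teams review each release. As a minimal sketch (the fields, the 1–5 scoring scale, and the review threshold are illustrative conventions, not part of any formal PIA standard), such a register might be kept in code so it can be versioned alongside the system it describes:

```python
from dataclasses import dataclass

@dataclass
class PrivacyRisk:
    """One entry in a lightweight PIA risk register (illustrative fields)."""
    description: str
    likelihood: int  # 1 (rare) .. 5 (almost certain)
    impact: int      # 1 (negligible) .. 5 (severe)
    mitigation: str

    @property
    def score(self) -> int:
        # Simple likelihood-times-impact scoring, common in risk matrices
        return self.likelihood * self.impact

def high_priority(risks, threshold=12):
    """Return risks whose score meets the review threshold, highest first."""
    return sorted(
        (r for r in risks if r.score >= threshold),
        key=lambda r: r.score,
        reverse=True,
    )

register = [
    PrivacyRisk("Health records stored unencrypted", 3, 5, "Encrypt at rest"),
    PrivacyRisk("Analytics SDK over-collects location", 4, 4, "Strip GPS fields"),
    PrivacyRisk("Verbose debug logs in production", 2, 2, "Redact log output"),
]

for risk in high_priority(register):
    print(f"{risk.score:>2}  {risk.description} -> {risk.mitigation}")
```

Keeping the register next to the code makes it natural to revisit during design reviews, rather than once per year.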

By adopting a proactive stance, organizations reduce the likelihood of privacy violations and the fallout that comes with them. It also sets a cultural tone—one that values foresight, responsibility, and the ethical handling of data.

Real-world example

Consider a smart thermostat company that designs its product to collect minimal user data, limits location tracking, and ensures all data is encrypted before it is transmitted. These features are embedded before the product reaches the market, thereby preempting potential privacy concerns and increasing consumer trust.

Principle 2: Privacy as the Default Setting

Privacy as the default means that individuals’ personal data is automatically protected in any system or service, without the need for them to take action. Users should not have to navigate complex settings to secure their privacy—it should be guaranteed from the moment they interact with a product or service.

This principle is particularly significant in user interface and user experience (UI/UX) design. Defaults matter because most users rarely change them. A privacy-conscious design ensures that the least amount of data is collected, processed, and retained, all without user intervention.

Under this principle, systems should operate with data minimization and purpose limitation in mind. Only the data strictly necessary for a given task should be collected, and it should not be repurposed for something else without explicit consent.
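One way to make “privacy as the default” concrete in code is to ensure every setting starts at its most protective value, so a user who never opens the settings screen still gets data minimization. A minimal sketch (the field names are hypothetical, not any particular product’s API):

```python
from dataclasses import dataclass

@dataclass
class PrivacySettings:
    """Hypothetical account settings: every field defaults to the most
    protective option, so inaction on the user's part shares nothing."""
    profile_public: bool = False
    location_sharing: bool = False
    personalized_ads: bool = False
    analytics_opt_in: bool = False

    def shared_data_categories(self):
        """List only the categories the user has explicitly enabled."""
        return [name for name, enabled in vars(self).items() if enabled]

# A brand-new account shares nothing until the user opts in.
fresh = PrivacySettings()
print(fresh.shared_data_categories())

# Sharing happens only through an explicit, per-category choice.
opted = PrivacySettings(personalized_ads=True)
print(opted.shared_data_categories())
```

The design choice worth noting: defaults live in one place, in the type itself, so no code path can create an account that is accidentally more permissive than intended.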

Application in digital platforms

A social networking site might default user profiles to private and provide granular controls to let users decide what to share and with whom. An e-commerce platform could anonymize user data by default unless the user opts in for personalized recommendations. A health app might store user health data locally on the device rather than uploading it to cloud servers unless explicitly approved.

Legal implications

This principle aligns directly with Article 25 of the GDPR, which states that data controllers must implement “data protection by default.” Organizations that do not follow this principle may find themselves in breach of legal requirements, facing penalties and regulatory investigations.

Building a Culture of Privacy

Adopting the first two principles of Privacy by Design—being proactive and making privacy the default—requires more than technology. It demands a shift in organizational mindset. Leadership must prioritize privacy and foster a culture where it is everyone’s responsibility, from engineers to marketers.

Training and awareness are critical. Teams should understand not only the legal requirements but also the ethical implications of mishandling data. Privacy must be treated as a shared value, not just a compliance obligation.

Privacy champions or privacy officers can play a pivotal role in advocating these principles across departments. They help ensure that projects are aligned with the organization’s privacy values from ideation to implementation.

Tools and Techniques to Support Implementation

To effectively implement the proactive and default privacy principles, several tools and frameworks are available:

  • Privacy Impact Assessments (PIAs): These help identify privacy risks early in a project and recommend measures to mitigate them.
  • Data Flow Mapping: Understanding where data travels within and outside the system helps identify areas where leaks or over-collection may occur.
  • User Journey Mapping: Aligning privacy checkpoints with user touchpoints can improve the way privacy defaults are presented.
  • Consent Management Platforms (CMPs): These enable organizations to manage user preferences in line with regulatory requirements while offering transparency.
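Data flow mapping, in particular, lends itself to lightweight automation: once each processing step declares the data categories it receives, a script can flag flows outside the declared purpose. A rough sketch, assuming hypothetical step and category names:

```python
# Minimal data-flow map: each processing step declares which data
# categories it receives. A check flags any step that handles a
# category not covered by the declared purpose (all names hypothetical).

DECLARED_PURPOSE = {"email", "order_history"}  # what the privacy notice covers

data_flows = {
    "checkout_service": {"email", "order_history"},
    "recommendation_engine": {"order_history", "browsing_history"},
    "newsletter_sender": {"email"},
}

def undeclared_flows(flows, declared):
    """Return steps receiving data categories outside the declared purpose."""
    return {
        step: sorted(categories - declared)
        for step, categories in flows.items()
        if categories - declared
    }

print(undeclared_flows(data_flows, DECLARED_PURPOSE))
```

Here the check would surface the recommendation engine’s use of browsing history, exactly the kind of quiet over-collection that data flow mapping is meant to catch.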

These tools, when combined with a privacy-conscious culture, make proactive and default-based privacy not just possible but practical.

Benefits Beyond Compliance

Organizations often view privacy as a legal burden, but Privacy by Design flips this narrative. By embedding privacy early, companies benefit in multiple ways:

  • Cost Savings: Preventing privacy issues is far cheaper than fixing them post-incident.
  • Trust and Loyalty: Consumers are more likely to remain loyal to brands that respect their privacy.
  • Innovation Enablement: Designing with privacy in mind can lead to better, more efficient, and user-respecting products.
  • Competitive Edge: As privacy becomes a key differentiator, companies that lead in this space often outperform those that lag.

Challenges and Misconceptions

Despite its clear advantages, implementing Privacy by Design is not without challenges. Common misconceptions include:

  • Believing privacy limits innovation. In reality, it inspires better design.
  • Thinking it’s too costly. While there may be upfront investment, it pays off long term.
  • Assuming it’s a one-time fix. Privacy by Design is a continuous process that evolves with technology and user expectations.

Overcoming these myths requires education and commitment. When stakeholders understand the long-term benefits, they’re more likely to embrace the model.

As technology becomes more complex and interconnected—from IoT to AI and beyond—the demand for trustworthy, secure systems will only grow. Privacy by Design provides a future-proof foundation to build those systems with ethics, transparency, and user empowerment at the core.

In upcoming explorations, we’ll dive deeper into the remaining principles, each of which adds a layer of depth and nuance to the Privacy by Design framework. These principles will include how to embed privacy directly into system architecture, achieve positive-sum outcomes rather than trade-offs, and maintain transparency and user-centricity.

By taking a proactive and default-first stance on privacy, we can create a digital world that’s not only more secure but also more humane. Privacy is not a barrier to innovation—it’s a beacon guiding its responsible evolution.

Continuing the Journey with Privacy by Design

Building on the foundational understanding of Privacy by Design and its first two principles—being proactive and making privacy the default—we now delve into the next three principles. These principles guide how privacy should be integrated directly into the system architecture, how to create win-win outcomes rather than trade-offs, and how to ensure data protection throughout the lifecycle of information.

These ideas elevate privacy from a mere policy concern to a key design consideration, influencing every aspect of technology development and deployment. As organizations mature in their privacy practices, they realize that real-world implementation of these principles is both a technical and cultural journey. The goal remains consistent: embed privacy into the DNA of every product, process, and system.

Principle 3: Privacy Embedded into Design

This principle establishes that privacy must be an integral part of system architecture and business practices—not something added after development. Privacy by Design isn’t about reacting to breaches or legal mandates. Instead, it requires privacy to be a core design feature, embedded at every stage of the product or system lifecycle.

To embed privacy into design means considering privacy concerns in the earliest stages of conceptualization, planning, and development. Engineers, UX designers, data scientists, and business strategists must work collaboratively to ensure privacy is not tacked on later but baked in from the start.

Key strategies for embedding privacy into design

  1. Data Minimization Techniques
    Collect only the data that is absolutely necessary for the purpose at hand. Avoid speculative data gathering that may lead to misuse or unnecessary risk.
  2. Pseudonymization and Anonymization
    Use techniques that reduce data identifiability. Replace personal identifiers with artificial identifiers or remove them entirely where possible.
  3. Access Controls
    Limit data access to only those who need it. Use role-based access, multifactor authentication, and audit logs to track and monitor access.
  4. Encryption and Secure Communication
    Build strong encryption protocols directly into data storage and transmission systems. These should be implemented as part of the design rather than added later.
  5. Modular System Design
    Design systems with modularity so that if one part is compromised, it doesn’t expose the entire system.

Case example: Embedded privacy in smart home devices

A smart home company wants to release a new voice assistant. Rather than storing every voice interaction on the cloud, the system processes basic commands locally and uploads data only when user consent is explicitly given. Furthermore, microphone access is turned off by default and physically indicated by a light when in use.

By embedding privacy into design, the company reduces risk while also increasing consumer confidence and regulatory compliance.

Principle 4: Full Functionality — Positive-Sum, Not Zero-Sum

Privacy by Design emphasizes that it’s not necessary to trade privacy for other functionalities like security, usability, or business efficiency. Instead of a zero-sum mindset (where one gain comes at another’s loss), it promotes a positive-sum approach where multiple objectives can be achieved without compromise.

In practice, this principle urges teams to reject the false dichotomy of “privacy versus profit” or “security versus usability.” With creative and thoughtful design, it is possible to meet privacy requirements while still achieving functionality, innovation, and competitiveness.

Practical examples of positive-sum design

  1. Privacy and Security
    Privacy and security are often seen as synonymous, but while they overlap, they have different objectives. Security protects data from unauthorized access, while privacy governs how that data is collected, used, and shared. A well-designed system ensures both through techniques like differential privacy, secure multiparty computation, and zero-knowledge proofs.
  2. Personalization and Data Minimization
    Many digital services rely on personalization to enhance user experience. Rather than collecting extensive personal data, companies can use client-side processing, local storage, or contextual signals to tailor services without sending data to centralized servers.
  3. Marketing and Consent
    A company can run successful marketing campaigns by targeting broad segments using anonymized or consent-based data rather than intrusive tracking or surveillance techniques.

Design thinking and multidisciplinary collaboration

Achieving positive-sum outcomes requires interdisciplinary collaboration. Engineers, designers, marketers, and legal professionals must jointly explore solutions that respect privacy without hindering innovation. Tools such as design sprints, ethical design workshops, and privacy risk mapping help foster this collaboration.

A design thinking mindset encourages asking not “How do we comply?” but “How do we build trust while exceeding user expectations?” This reframing leads to smarter, more sustainable solutions.

Principle 5: End-to-End Security — Full Lifecycle Protection

Data protection doesn’t end at the point of collection. Privacy by Design demands strong security measures that protect data throughout its entire lifecycle—from initial acquisition and processing to storage, access, and eventual disposal.

End-to-end security means safeguarding data not only from hackers but also from internal misuse, accidental leaks, and unauthorized sharing. It encompasses the entire information lifecycle and includes both technical safeguards and organizational practices.

Key components of full lifecycle protection

  1. Secure Collection
    Collect data over secure channels (e.g., HTTPS, TLS). Authenticate sources and validate data inputs to prevent injection attacks and data poisoning.
  2. Secure Processing
    Apply processing techniques like homomorphic encryption or differential privacy where applicable. Maintain data integrity and prevent manipulation.
  3. Secure Storage
    Use encryption at rest and access controls to ensure data is protected in storage. Implement redundancy and backup measures to prevent data loss.
  4. Data Retention and Minimization
    Only retain data as long as necessary. Establish clear retention schedules and deletion protocols. Automate deletion to avoid human error.
  5. Secure Disposal
    When data is no longer needed, securely delete or destroy it. Use tools that overwrite or erase data completely to prevent recovery.

Compliance and global expectations

Regulations like the GDPR, HIPAA, and CCPA emphasize the importance of data lifecycle management. Article 32 of the GDPR, for instance, mandates that organizations implement appropriate technical and organizational measures to ensure data security.

Failing to protect data throughout its lifecycle can lead to fines, loss of consumer trust, and reputational damage. In contrast, demonstrating end-to-end data stewardship builds organizational credibility and customer loyalty.

Real-world example: Lifecycle protection in financial services

A fintech company handling sensitive financial data implements lifecycle protection by encrypting data at collection, restricting access to authorized personnel, maintaining immutable logs of all access attempts, and automatically deleting customer data after a specified period in accordance with retention policies. These measures reduce liability and establish trust.

Organizational Implications of Implementing These Principles

Applying the three principles discussed—embedded design, full functionality, and lifecycle protection—requires structural and cultural adjustments within organizations. Privacy by Design must move beyond policy and into practice.

Building cross-functional teams

Privacy should not be siloed within the legal or IT department. It needs champions across departments, including marketing, HR, customer support, and development teams. Cross-functional privacy teams ensure that privacy is addressed from multiple angles and is not overlooked in favor of other priorities.

Integrating privacy into development workflows

Privacy must be incorporated into Agile, DevOps, and continuous integration/continuous deployment (CI/CD) pipelines. This involves:

  • Adding privacy reviews to sprint planning and retrospectives
  • Including privacy checks in automated testing scripts
  • Conducting privacy threat modeling alongside security assessments
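A privacy check in an automated test suite can be as simple as scanning structured log events for fields that look like direct identifiers. A minimal sketch (the field names and single regex are illustrative; a production detector would be far more thorough):

```python
import re

FORBIDDEN_FIELDS = {"email", "ssn", "full_name", "phone"}
EMAIL_PATTERN = re.compile(r"[^@\s]+@[^@\s]+\.[^@\s]+")

def privacy_violations(event: dict) -> list:
    """Return (field, reason) pairs for entries that should not be logged."""
    findings = []
    for field, value in event.items():
        if field in FORBIDDEN_FIELDS:
            findings.append((field, "forbidden field name"))
        elif isinstance(value, str) and EMAIL_PATTERN.search(value):
            findings.append((field, "value looks like an email address"))
    return findings

event = {
    "user_ref": "u_1042",
    "email": "x@example.com",
    "note": "contact bob@example.org",
}
for field, reason in privacy_violations(event):
    print(f"{field}: {reason}")
```

Wired into CI, a non-empty findings list fails the build, so over-logging is caught in review rather than in an incident report.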

By making privacy a routine part of development, teams ensure that it becomes a standard consideration, not an afterthought.

Policy and governance alignment

To support embedded privacy, organizations must update their internal policies, data handling procedures, and third-party contracts. This includes:

  • Establishing clear data governance frameworks
  • Vetting third-party vendors for privacy practices
  • Aligning internal privacy policies with design and engineering practices

Transparency and accountability structures—like privacy dashboards and internal audits—reinforce these governance efforts.

Technology Trends Influencing These Principles

As technology evolves, new tools and frameworks are emerging to support Privacy by Design.

  1. Privacy Engineering
    A growing discipline that combines software engineering with privacy principles. Tools and libraries for privacy-preserving computation, such as TensorFlow Privacy or OpenMined, are enabling developers to embed privacy directly into AI/ML models.
  2. Edge Computing
    Processing data on local devices instead of the cloud helps achieve data minimization and lifecycle protection, as fewer data points are transmitted and stored.
  3. Zero Trust Architectures
    An approach where no user or device is automatically trusted. Zero trust frameworks help enforce strict access controls and validate each request.
  4. Decentralized Identity
    Shifting identity control back to users via blockchain and other decentralized technologies supports privacy as the default and lifecycle protection.

These trends illustrate how privacy can be synergistic with cutting-edge innovation rather than at odds with it.

Challenges and How to Overcome Them

Organizations may face resistance when adopting Privacy by Design principles. Common challenges include:

  • Resource constraints: It takes time and investment to train staff, adapt workflows, and adopt new tools.
  • Complex legacy systems: Older systems may not support modern privacy features without significant overhaul.
  • Cultural inertia: Shifting organizational priorities to value privacy as a design goal can face pushback.

To overcome these, leadership must advocate for privacy as a long-term asset rather than a short-term cost. Pilot projects, success stories, and measurable ROI can help build momentum and justify the investment.

The Bigger Picture

Privacy by Design is not just a technical framework—it’s a philosophy rooted in ethics, responsibility, and respect for human dignity. The principles discussed here help organizations build products and services that respect user autonomy while also achieving performance, growth, and innovation.

These practices position organizations as privacy leaders, earn user trust, and contribute to a healthier digital ecosystem. As privacy threats grow more complex, so too must our responses evolve. These principles guide that evolution—one system, one decision, one line of code at a time.

In the next section, we will explore the final two principles of Privacy by Design: transparency and user-centricity. These complete the framework by ensuring that users remain informed, empowered, and at the center of all data-related decisions. Together, these seven principles form a comprehensive roadmap to responsible and ethical data stewardship in the digital age.

Completing the Framework: Transparency, User-Centricity, and the Future of Privacy by Design

Privacy by Design is not simply a set of technical standards or regulatory obligations—it is a vision for how systems, products, and services should treat personal data with dignity, respect, and integrity. After examining the first five principles—proactive design, default privacy, embedded architecture, positive-sum outcomes, and full lifecycle protection—we now explore the final two: visibility and transparency, and respect for user privacy.

These last two principles focus not on what systems do, but how users understand, control, and trust them. They are about communication, ethics, and empowering the individual. As we navigate a world of AI, biometrics, and smart devices, these user-facing principles are more crucial than ever.

Principle 6: Visibility and Transparency — Keep It Open

This principle emphasizes that all business practices and technologies should operate according to an open, transparent process. Users should not be kept in the dark about how their data is collected, processed, and stored. Instead, they should have access to clear, accurate information that helps them make informed choices.

Transparency isn’t just about publishing a privacy policy—it’s about meaningful disclosure. It means giving users insight into how their data is handled, and allowing regulators, auditors, and stakeholders to assess and verify those processes.

The need for visibility in complex systems

Modern data systems—especially those involving AI and machine learning—can be highly opaque. Algorithms make decisions about credit scores, job applications, insurance rates, and more. When people are affected by automated decisions, they should have a right to understand the logic behind them.

Transparent systems are:

  • Accountable: Others can evaluate whether data practices meet legal and ethical standards.
  • Trustworthy: Users are more likely to engage with systems they understand.
  • Resilient: Issues can be discovered and corrected more easily in open environments.

Tools and strategies for achieving transparency

  1. Clear and layered privacy notices
    Present information in layers—offer simple, high-level summaries with links to deeper legal explanations. Use plain language and visuals where appropriate.
  2. Consent dashboards and privacy settings
    Provide accessible tools for users to review what they’ve agreed to, and allow them to change preferences easily.
  3. Audit logs and reporting tools
    Maintain detailed logs of how data is accessed and by whom. These are critical for internal governance and for external regulators or auditors.
  4. Algorithmic transparency
    Offer explanations for automated decisions, especially when they impact rights or opportunities. Consider using model cards, explainable AI (XAI), and fairness audits.
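Audit logs are most useful when they are tamper-evident, so that auditors can trust what they read. One common approach is hash chaining, where each entry’s hash covers the previous entry. A self-contained sketch (the entry fields are illustrative):

```python
import hashlib
import json

def append_entry(log, entry):
    """Append an access record whose hash chains to the previous entry,
    so later tampering breaks the chain (a simple audit-log sketch)."""
    prev_hash = log[-1]["hash"] if log else "0" * 64
    payload = json.dumps(entry, sort_keys=True)
    entry_hash = hashlib.sha256((prev_hash + payload).encode()).hexdigest()
    log.append({"entry": entry, "hash": entry_hash})
    return log

def verify(log):
    """Recompute the chain; return False if any entry was altered."""
    prev_hash = "0" * 64
    for item in log:
        payload = json.dumps(item["entry"], sort_keys=True)
        expected = hashlib.sha256((prev_hash + payload).encode()).hexdigest()
        if expected != item["hash"]:
            return False
        prev_hash = item["hash"]
    return True

log = []
append_entry(log, {"actor": "dr_smith", "action": "read", "record": "patient_17"})
append_entry(log, {"actor": "billing", "action": "read", "record": "patient_17"})
print(verify(log))

log[0]["entry"]["actor"] = "someone_else"  # simulate tampering
print(verify(log))
```

Because each hash depends on everything before it, quietly editing or deleting one entry invalidates the rest of the chain.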

Real-world example: Transparent healthcare platforms

A telehealth provider offers patients a dashboard that shows what data is collected during visits, how it’s stored, and who has accessed it. The platform also sends alerts whenever new access occurs. By making its data handling practices visible, the provider builds a higher level of trust with users.

Legal and regulatory reinforcement

Transparency is a cornerstone of modern data protection laws. The GDPR, for example, mandates transparency in Articles 12–14, requiring organizations to inform individuals about data processing activities in a concise, intelligible, and easily accessible form. Similar requirements appear in California’s CCPA, Brazil’s LGPD, and other global frameworks.

Organizations that fail to meet these standards may face fines, legal action, and reputational damage. But beyond compliance, transparency fosters sustainable relationships with users.

Principle 7: Respect for User Privacy — Keep It User-Centric

The seventh and final principle of Privacy by Design asserts that systems must be designed around the user. This means giving users agency over their data and prioritizing their interests and expectations. It’s about putting people before processes and treating privacy as a human right rather than a technical feature.

Respect for user privacy means:

  • Empowering individuals to control their data
  • Designing intuitive, respectful experiences
  • Building trust through consent, clarity, and choice

Designing for user empowerment

  1. User-friendly interfaces for privacy management
    Make it easy for users to understand and change privacy settings. Interfaces should not be manipulative or use dark patterns to coerce consent.
  2. Granular consent mechanisms
    Allow users to choose what types of data they want to share and for what purposes, rather than using all-or-nothing approaches.
  3. Right to access, correct, and delete
    Ensure users can review their data, correct inaccuracies, and request deletion without friction.
  4. Feedback loops and user involvement
    Invite user feedback on privacy practices, and involve real users in usability testing for privacy controls.
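The granular consent mechanism described above maps naturally onto a small per-purpose ledger, where nothing is permitted unless the user granted it and every change is timestamped. A minimal sketch (purpose names and structure are hypothetical):

```python
from datetime import datetime, timezone

class ConsentLedger:
    """Per-purpose consent store: the default answer is always 'no',
    and each change is recorded with a timestamp (illustrative sketch)."""

    def __init__(self):
        self._grants = {}

    def set(self, purpose: str, granted: bool):
        self._grants[purpose] = {
            "granted": granted,
            "at": datetime.now(timezone.utc).isoformat(),
        }

    def allows(self, purpose: str) -> bool:
        # A purpose the user was never asked about is never allowed.
        return self._grants.get(purpose, {}).get("granted", False)

ledger = ConsentLedger()
ledger.set("analytics", True)
ledger.set("marketing_email", False)

print(ledger.allows("analytics"))
print(ledger.allows("marketing_email"))
print(ledger.allows("location"))  # never asked, so denied by default
```

Checking every data use against such a ledger, rather than a single all-or-nothing flag, is what makes consent genuinely granular.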

Ethical considerations

Respecting user privacy also involves acknowledging cultural, regional, and personal differences in privacy expectations. What feels invasive in one culture may not in another. Organizations must be sensitive to these variations and avoid a one-size-fits-all model.

Ethical design considers:

  • Contextual integrity: The appropriateness of data collection and use depends on context.
  • Power imbalances: Recognizing that users often don’t fully understand or control how their data is used.
  • Long-term consequences: Designing with foresight about how data practices affect future outcomes.

Example: Respect in social media platforms

A social media app redesigns its privacy settings to be more granular and visible. Instead of hiding options in multiple layers of menus, the platform introduces a privacy onboarding process that walks users through key decisions during signup. It also allows users to download all their data and delete their account with a single button.

This not only improves user experience but significantly reduces privacy complaints and boosts trust metrics.

Combining the Principles for Holistic Privacy

The seven principles of Privacy by Design are not meant to stand alone. They interlock, each reinforcing the others to form a robust, ethical, and sustainable privacy framework.

When organizations apply all seven principles together, the results are transformative:

  • Systems are resilient because they anticipate problems and include built-in safeguards.
  • Users feel respected and informed, enhancing trust and long-term engagement.
  • Innovation is not stifled but enhanced through ethical, inclusive design.
  • Regulatory compliance becomes a natural outcome, not a forced obligation.

Synergy in action: Building a privacy-first mobile banking app

Imagine a fintech startup building a mobile banking app from scratch. Applying the seven principles might look like this:

  • Proactive: They perform privacy risk assessments during the planning phase.
  • Privacy by default: Only essential data (like identity and transaction history) is collected.
  • Embedded into design: Data encryption, user authentication, and secure APIs are built from day one.
  • Positive-sum: The app offers strong security and convenience without compromising privacy.
  • Lifecycle protection: Data is encrypted, stored securely, and automatically deleted when no longer needed.
  • Transparency: Users get a clear dashboard showing data usage, access history, and consent options.
  • User-centricity: Privacy settings are customizable, easy to access, and respectful of user choices.

This integrated approach not only delivers a secure product but positions the startup as a trusted player in a competitive market.

Implementing Privacy by Design in Practice

Organizations often ask: how do we begin applying Privacy by Design? Here’s a roadmap to transition from theory to implementation.

Step 1: Leadership buy-in

Organizational change begins at the top. Executives and stakeholders must understand the value of privacy—not just for compliance but as a long-term strategic asset.

  • Present case studies showing ROI of privacy investments
  • Highlight legal, reputational, and operational risks of neglect

Step 2: Conduct a privacy audit

Understand your current posture. Identify what data you collect, how it flows through your systems, where it’s stored, who has access, and what controls are in place.

  • Use data mapping tools
  • Involve cross-functional teams to cover technical and business processes

Step 3: Integrate privacy into development workflows

Adopt privacy engineering practices:

  • Include privacy requirements in product specs
  • Run privacy impact assessments (PIAs)
  • Use secure coding and privacy-enhancing technologies (PETs)

Step 4: Foster a privacy-aware culture

  • Offer training and awareness sessions
  • Appoint privacy champions in each department
  • Encourage open dialogue about ethical data use

Step 5: Monitor, improve, and adapt

Privacy is not a one-time project—it’s a continuous process. Monitor data practices, audit systems regularly, and evolve based on feedback and new risks.

The Future of Privacy by Design

As we move deeper into the age of artificial intelligence, IoT, and augmented reality, the challenges surrounding privacy will multiply. Predictive algorithms, biometric surveillance, and real-time tracking are raising new ethical and legal questions. But these technologies also offer an opportunity to reimagine how privacy can be protected at scale.

Emerging trends that will shape the future of Privacy by Design include:

  • Federated learning: Training machine learning models without centralizing user data
  • Privacy-preserving computation: Techniques like homomorphic encryption and secure enclaves
  • Decentralized identity management: Giving users control over their digital credentials
  • Ethical AI frameworks: Embedding fairness, accountability, and transparency into algorithms

Privacy by Design will remain a guiding framework for navigating this landscape. Its principles offer a moral and operational compass in an era where the line between innovation and intrusion is increasingly blurred.

Conclusion

The journey through Privacy by Design reveals a clear truth: privacy is not a barrier to innovation, but a foundation for trust, ethics, and sustainability. The seven principles—proactive design, default settings, embedded privacy, positive-sum outcomes, lifecycle protection, transparency, and user-centricity—offer a powerful roadmap for responsible data stewardship.

In an age where technology increasingly mediates our relationships, choices, and identities, building systems that honor privacy is not just a technical goal but a societal imperative. Organizations that embrace these principles don’t just comply with laws—they lead with integrity, earn trust, and shape a more humane digital future.

As individuals, advocates, and professionals, the power to design for privacy is in our hands. The future will be shaped not just by what we build, but by how—and for whom—we build it.