Data Protection¶
2025.2
CAP Index takes the confidentiality and integrity of its customer data very seriously. As stewards and partners of CAP Index Customers, we strive to ensure data is protected from unauthorized access and available when needed. The following policies drive many of our procedures and technical controls in support of the CAP Index mission of data protection.
Production systems that create, receive, store, or transmit Customer data (hereafter “Production Systems”) must follow the requirements and guidelines described in this section.
Policy Statements¶
CAP Index policy requires that:
(a) Data must be handled and protected according to its classification requirements and following approved encryption standards, if applicable.
(b) Whenever possible, store data of the same classification in a given data repository and avoid mixing sensitive and non-sensitive data in the same repository. Security controls, including authentication, authorization, data encryption, and auditing, should be applied according to the highest classification of data in a given repository.
(c) Workforce members shall not have direct administrative access to production data during normal business operations. Exceptions include emergency operations such as forensic analysis, analytics, customer reporting, and manual disaster recovery.
(d) All Production Systems must disable services that are not required to achieve the business purpose or function of the system.
(e) All access to Production Systems must be logged, following the CAP Index Auditing Policy.
(f) All Production Systems must have security monitoring enabled, including activity and file integrity monitoring, vulnerability scanning, and/or malware detection, as applicable.
Controls and Procedures¶
Data Protection Implementation and Processes¶
Data is classified and handled according to the CAP Index Data Handling Specifications and Data Classification document.
Critical, confidential, and internal data will be tagged upon creation, if tagging is supported. Each tag maps to a data type defined in the data classification scheme, which then maps to a protection level for encryption, access control, backup, and retention. Data classification may alternatively be identified by its location/repository. For example, source code in CAP Index’s Team Foundation Services repos is considered “Internal” by default, even though a tag is not directly applied to each source file.
Critical and confidential data is always stored and transmitted securely, using approved encryption standards. More details are specified in CAP Index’s Data Classification and Handling document.
All IT systems that process and store sensitive data follow the provisioning, configuration, change management, patching, and anti-malware standards defined in the Configuration and Change Management document.
Customer/Production Data Protection¶
CAP Index hosts on Azure Cloud Services in the US-East (Virginia) region by default. Data is replicated across multiple regions for redundancy and disaster recovery.
All CAP Index employees, systems, and resources adhere to the following standards and processes to reduce the risk of compromise of Production Data:
- Implement and/or review controls designed to protect Production Data from improper alteration or destruction.
- Conduct Data Protection Impact Assessments (DPIAs) for new projects, systems, or processes that involve the processing of personal data, particularly when such processing is likely to result in high risks to the rights and freedoms of individuals.
- Ensure that confidential data is stored in a manner that supports user access logs and automated monitoring for potential security incidents.
- Ensure CAP Index Customer Production Data is segmented and only accessible to the Customers authorized to access that data.
- All Production Data at rest is stored on encrypted volumes using encryption keys managed by CAP Index or keys managed by Microsoft. Encryption at rest is ensured through cloud service configuration policies referenced in Configuration and Change Management.
- Volume encryption keys and machines that generate volume encryption keys are protected from unauthorized access. Volume encryption key material is protected with access controls such that the key material is only accessible by privileged accounts.
- Encrypted volumes use approved cipher algorithms, key strength, and key management process as defined in §12.3.1 above.
Access¶
CAP Index employee access to production is disabled by default and guarded by an approval process. Production access requests are reviewed by the security team on a case-by-case basis.
Separation¶
Customer data is logically separated at the database/datastore level using a unique identifier for the institution. The separation is enforced at the API layer where the client must authenticate with a chosen institution and then the customer unique identifier is included in the access token and used by the API to restrict access to data to the institution. All database/datastore queries then include the institution identifier.
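The separation pattern above can be sketched as follows. This is an illustrative sketch only: the claim and field names (`institution_id`) are assumptions for the example, not CAP Index’s actual schema or implementation.

```python
# Sketch of API-layer tenant separation: the institution identifier carried
# in the access token scopes every datastore query.
# Names here (institution_id) are hypothetical, for illustration only.

def query_records(token_claims: dict, datastore: list[dict]) -> list[dict]:
    """Return only rows belonging to the institution named in the access token."""
    institution = token_claims["institution_id"]  # set at authentication time
    # Every database/datastore query includes the institution identifier.
    return [row for row in datastore if row["institution_id"] == institution]
```

In practice the filter would be applied in the query itself (e.g. a `WHERE` clause) rather than in application memory; the point is that the identifier comes from the validated token, never from client-supplied input.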
Customer Data Anonymization¶
Anonymization is a key part of CAP Index’s data protection strategy to ensure the privacy and security of individuals’ information. Anonymization procedures are described below:
- Data Identification: We begin by identifying the personal data that needs to be anonymized. This includes data collected directly from individuals as well as data generated through their interactions with our services.
- Anonymization Techniques: We employ various anonymization techniques to ensure that personal data cannot be traced back to an individual. These techniques include:
  - Data Masking: Obscuring specific data elements to prevent identification.
  - Generalization: Reducing the precision of data to make it less specific.
  - Pseudonymization: Replacing private identifiers with fake identifiers or pseudonyms.
  - Data Aggregation: Combining data in a way that individual details are not discernible.
- Compliance with Regulations: Our anonymization practices comply with relevant data protection regulations, such as GDPR and CCPA, which set standards for data anonymization to ensure privacy protection.
- Regular Review and Updates: Anonymization processes and techniques are regularly reviewed and updated to align with the latest regulatory requirements and industry best practices. CAP Index conducts periodic audits to verify the effectiveness of our anonymization measures and makes necessary adjustments.
- Employee Training: Employees involved in data handling and anonymization processes are provided with specialized training to ensure they understand and correctly implement anonymization techniques.
- Documentation: We maintain detailed documentation of our anonymization processes, including methodologies used and data sets anonymized. This documentation is available for review by regulatory authorities if required.
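The four techniques listed above can be illustrated with a minimal sketch. The field choices (SSN-style strings, ages, ZIP codes) and function names are assumptions for the example, not CAP Index’s production anonymization pipeline.

```python
# Minimal illustrations of masking, generalization, pseudonymization, and
# aggregation. Hypothetical data shapes; not a production implementation.
import hashlib


def mask(value: str, keep: int = 2) -> str:
    """Data masking: obscure all but the last `keep` characters."""
    return "*" * (len(value) - keep) + value[-keep:]


def generalize_age(age: int, bucket: int = 10) -> str:
    """Generalization: reduce an exact age to a ten-year band."""
    lo = (age // bucket) * bucket
    return f"{lo}-{lo + bucket - 1}"


def pseudonymize(identifier: str, salt: str) -> str:
    """Pseudonymization: replace an identifier with a salted hash pseudonym."""
    return hashlib.sha256((salt + identifier).encode()).hexdigest()[:12]


def aggregate_counts(rows: list[dict], key: str) -> dict:
    """Aggregation: report only group counts, never individual rows."""
    counts: dict = {}
    for row in rows:
        counts[row[key]] = counts.get(row[key], 0) + 1
    return counts
```

Note that pseudonymized data is still personal data under GDPR if the salt (the re-identification key) is retained; only the aggregated or fully masked forms approach true anonymization.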
Routine Assessment of Personal information¶
By routinely assessing our data retention practices, CAP Index ensures that personal information is not retained longer than necessary, thereby minimizing risks related to data breaches, non-compliance, and unnecessary data storage costs.
- Retention Policy:
  - CAP Index has a comprehensive data retention policy that defines the retention periods for various types of personal data based on legal, regulatory, and business requirements.
  - The policy ensures that personal data is retained only for as long as necessary to fulfill the purposes for which it was collected, after which it is securely deleted or anonymized.
- Regular Reviews:
  - Scheduled Audits: CAP Index conducts scheduled audits on a quarterly basis to review data retention practices across all departments. These audits involve checking data storage systems, databases, and records to identify any personal data retained beyond its required period.
  - Automated Monitoring: CAP Index uses automated monitoring tools to track data retention periods and generate alerts when data approaches its retention limit. This helps ensure timely deletion or archiving of data.
- Data Inventory:
  - Data Mapping: CAP Index maintains an up-to-date data inventory that includes details about the types of personal data collected, the purposes of processing, and the associated retention periods.
  - Lifecycle Management: The data inventory is used to manage the data lifecycle, ensuring that data is retained, archived, or deleted according to the defined retention schedules.
- Deletion and Archiving Procedures:
  - Secure Deletion: Data that is no longer required is securely deleted using methods that ensure it cannot be recovered. This includes using tools that comply with industry standards for data destruction.
  - Archiving: In cases where data needs to be retained for historical or compliance purposes, it is archived in a secure manner, with restricted access and clear labeling to prevent unauthorized use.
- Compliance Checks:
  - Regulatory Compliance: CAP Index ensures that our data retention practices comply with relevant data protection regulations, such as GDPR and CCPA, which mandate specific retention periods for certain types of data.
  - Internal Policies: Our retention assessments also verify adherence to internal policies and contractual obligations related to data retention.
- Employee Training:
  - Awareness Programs: Employees are regularly trained on our data retention policies and the importance of adhering to retention schedules. This includes understanding the risks associated with retaining data longer than necessary.
  - Responsibilities: Specific roles and responsibilities related to data retention are clearly defined, ensuring that employees understand their part in managing data lifecycles.
- Reporting and Documentation:
  - Audit Reports: Detailed reports from retention audits are prepared, documenting any instances of non-compliance and the corrective actions taken.
  - Retention Logs: CAP Index maintains retention logs that record the dates of data deletion or archiving, ensuring a traceable record of compliance with our retention policies.
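The automated retention monitoring described above can be sketched as a simple scan. The category names and retention periods here are hypothetical placeholders; real schedules come from the CAP Index data retention policy.

```python
# Sketch of automated retention monitoring: flag records held past their
# category's retention period. Categories and periods are illustrative only.
from datetime import date, timedelta

RETENTION_DAYS = {"marketing": 365, "billing": 7 * 365}  # hypothetical schedule


def expired_records(records: list[dict], today: date) -> list:
    """Return IDs of records retained beyond their category's retention period."""
    expired = []
    for rec in records:
        limit = timedelta(days=RETENTION_DAYS[rec["category"]])
        if today - rec["created"] > limit:
            expired.append(rec["id"])
    return expired
```

A production version would run on a schedule, raise alerts as records approach their limit, and feed the deletion/archiving procedures rather than merely reporting.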
Backup and Recovery¶
For details on the backup and recovery process, see controls and procedures defined in Data Management.
Monitoring¶
CAP Index uses Azure Application Insights to monitor the entire cloud service operation. If a system failure occurs and an alarm is triggered, key personnel are notified by text, chat, and/or email so they can take appropriate corrective action.
CAP Index uses a security agent to monitor production systems. The agents monitor system activities, generate alerts on suspicious activities and report on vulnerability findings to a centralized management console.
Protecting Data At Rest¶
Encryption of Data at Rest¶
All databases, data stores, and file systems are encrypted with AES-256 using separate keys for each storage type. The keys are rotated periodically.
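The per-store-type keying described above can be illustrated with AES-256-GCM. This is a sketch under assumptions: it uses the third-party `cryptography` package, keeps keys in memory, and omits the key-vault storage and rotation that a real deployment requires; the `keys` dictionary and function names are hypothetical.

```python
# Illustrative AES-256 encryption with a separate key per storage type.
# Key storage/rotation (e.g. via a key vault) is out of scope for this sketch.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# One independent 256-bit key per storage type, as the policy requires.
keys = {
    "database": AESGCM.generate_key(bit_length=256),
    "filestore": AESGCM.generate_key(bit_length=256),
}


def encrypt(storage_type: str, plaintext: bytes) -> bytes:
    nonce = os.urandom(12)  # unique nonce per message, never reused per key
    ciphertext = AESGCM(keys[storage_type]).encrypt(nonce, plaintext, None)
    return nonce + ciphertext  # store the nonce alongside the ciphertext


def decrypt(storage_type: str, blob: bytes) -> bytes:
    nonce, ciphertext = blob[:12], blob[12:]
    return AESGCM(keys[storage_type]).decrypt(nonce, ciphertext, None)
```

GCM is authenticated encryption, so tampering with stored ciphertext causes decryption to fail rather than silently yielding corrupt plaintext.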
Local Disk/Volume Encryption¶
Encryption and key management for local disk encryption of on-premises servers and end-user devices follow the defined best practices for Windows, macOS, and Linux/Unix operating systems, such as BitLocker and FileVault.
Protecting Data In Transit¶
- All external data transmission is encrypted end-to-end using encryption keys managed by CAP Index. This includes, but is not limited to, cloud infrastructure and third-party vendors and applications.
- Transmission encryption keys and systems that generate keys are protected from unauthorized access. Transmission encryption key materials are protected with access controls and may only be accessed by privileged accounts.
- Transmission encryption keys use a minimum of 4096-bit RSA keys, or keys and ciphers of equivalent or higher cryptographic strength (e.g., 256-bit AES session keys in the case of IPSec encryption).
- Transmission encryption keys are limited to one year of use and must then be regenerated.
- For all CAP Index APIs, enforcement of authentication, authorization, and auditing is used for all remote systems sending, receiving, or storing data.
- System logs of Production Data access are kept. These logs must be available for audit.
Encryption of Data in Transit¶
All internet and intranet connections are encrypted and authenticated using TLS 1.2 (a strong protocol), ECDHE_RSA with P-256 (a strong key exchange), and AES_128_GCM (a strong cipher).
Data protection via end-user messaging channels¶
Restricted and sensitive data is not allowed to be sent over electronic end-user messaging channels such as email or chat, unless end-to-end encryption is enabled.
Protecting Data In Use¶
Data in Use, sometimes known as Data in Process, refers to active data being processed by systems and applications which is typically stored in a non-persistent digital state such as in computer random-access memory (RAM), CPU caches, or CPU registers.
Protection of data in use relies on application layer controls and system access controls. See the Production Security / SDLC and Access sections for details.
CAP Index applications implement logical account-level data segregation to protect data in a multi-tenancy deployment. In addition, CAP Index applications may incorporate advanced security features such as Attribute Based Access Control (ABAC) and Role Based Access Control (RBAC) for protection of data in use.
Encryption Key Management¶
CAP Index uses Azure Key Vault for encryption key management.
- Keys are unique to CAP Index environments and services.
- Keys are automatically rotated yearly.
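The yearly-rotation rule can be modeled as a versioned key ring. Azure Key Vault tracks key versions and rotation policies natively; the class below is a local illustration of the concept, not the Key Vault API, and its names are hypothetical.

```python
# Toy key-ring model: each rotation appends a new key version, and the
# current key is refused once it exceeds its maximum age.
from datetime import date, timedelta


class KeyRing:
    def __init__(self) -> None:
        self.versions: list[tuple[date, str]] = []  # (created, key_id), newest last

    def rotate(self, key_id: str, created: date) -> None:
        """Introduce a new key version; older versions remain for decryption."""
        self.versions.append((created, key_id))

    def current(self, today: date, max_age_days: int = 365) -> str:
        """Return the newest key, refusing it if rotation is overdue."""
        created, key_id = self.versions[-1]
        if today - created > timedelta(days=max_age_days):
            raise RuntimeError("key overdue for rotation")
        return key_id
```

Keeping superseded versions matters: data encrypted under an old key version must remain decryptable until it is re-encrypted or expires.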
Certificate Management¶
CAP Index uses Azure Key Vaults for certificate management.
- Certificates are renewed automatically.
- The security team monitors certificates for expiration, potential compromise, and use/validity. The certificate revocation process is invoked if a certificate is no longer needed or upon discovery of potential compromise.
Data Integrity Protection¶
When appropriate, CAP Index engineering should implement “Versioning” and “Lifecycle”, or an equivalent data management mechanism, such that direct edit and delete actions are not allowed on the data, preventing accidental or malicious overwrites. This protects against human errors and cyberattacks such as ransomware.
In Azure, the IAM and data management policy in production will be implemented accordingly when the environments are configured. When changes must be made, a new version is created instead of editing and overwriting existing data.
- All edits create a new version, and old versions are preserved for a period of time defined in the lifecycle policy.
- Data objects are “marked for deletion” when deleted so that they are recoverable, if needed, within a period of time defined by the data retention policy.
- Data is archived offsite, i.e., to a separate Azure account and/or region.
Additionally, all access to sensitive data is authenticated, and audited via logging of the infrastructure, systems and/or application.
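The versioning and mark-for-deletion behavior described in this section can be sketched as a toy object store. Class and method names are illustrative assumptions; the production mechanism is enforced by Azure storage configuration and IAM policy, not application code.

```python
# Toy versioned store: writes append versions, deletes only mark objects,
# and earlier versions stay recoverable. Illustrative names throughout.
class VersionedStore:
    def __init__(self) -> None:
        self.objects: dict[str, list[dict]] = {}  # key -> version history

    def put(self, key: str, value) -> None:
        """Writes never overwrite: each edit appends a new version."""
        self.objects.setdefault(key, []).append({"value": value, "deleted": False})

    def delete(self, key: str) -> None:
        """Deletes only mark the object; prior versions remain recoverable."""
        self.objects[key].append({"value": None, "deleted": True})

    def get(self, key: str):
        latest = self.objects[key][-1]
        return None if latest["deleted"] else latest["value"]

    def restore(self, key: str):
        """Recover by re-appending the most recent non-deleted version."""
        for version in reversed(self.objects[key]):
            if not version["deleted"]:
                self.put(key, version["value"])
                return version["value"]
        return None
```

Because nothing is edited or destroyed in place, a ransomware overwrite or accidental delete becomes just another version, and lifecycle policy governs when old versions are finally purged.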