Brazil Court Limits Identifiable Data Sharing Without Consent
Summary
Brazil's Superior Court of Justice has ruled that identifiable registration data, such as names and estimated income, cannot be shared with third parties by credit bureaus without explicit consent. This decision clarifies the interpretation of Brazil's General Data Protection Law (LGPD) in the credit market, distinguishing between internal credit risk analysis and external data sharing.
What changed
Brazil's Superior Court of Justice, in case REsp 2.201.694/SP, has issued a landmark decision significantly limiting the scope of 'credit protection' as a legal basis for processing personal data under the LGPD. The court ruled that while internal credit scoring and risk analysis may proceed without explicit consent, the transfer of identifiable registration data (name, taxpayer ID, estimated income) to third parties requires specific consent. This decision distinguishes between aggregated scoring data, regulated credit history, and directly identifiable individual data, rejecting the prior market practice of treating all 'credit data' as equivalent for sharing purposes.
This ruling has substantial implications for credit bureaus, data brokers, and companies operating within Brazil's credit market. Compliance officers must review data sharing agreements and consent mechanisms to ensure they align with the court's interpretation. The decision necessitates a re-evaluation of business models that relied on broad interpretations of the 'credit protection' legal basis for data circulation. Failure to comply with the requirement for explicit consent for identifiable data transfers could lead to legal challenges and potential penalties under the LGPD.
What to do next
- Review data sharing agreements with credit bureaus and third parties for compliance with consent requirements.
- Update internal data processing policies to distinguish between credit scoring and identifiable data transfers.
- Implement explicit consent mechanisms for sharing identifiable registration data in the credit market.
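The checklist above can be sketched as a simple policy gate. This is a purely illustrative sketch, not anything prescribed by the ruling: the category names, purposes, and function are hypothetical, and real compliance logic would need legal review.

```python
from enum import Enum, auto

class DataCategory(Enum):
    AGGREGATED_SCORE = auto()           # statistical risk score, no direct identifier
    CREDIT_HISTORY = auto()             # regulated under the Positive Credit Registry
    IDENTIFIABLE_REGISTRATION = auto()  # name, taxpayer ID, estimated income

class Purpose(Enum):
    INTERNAL_RISK_ANALYSIS = auto()
    THIRD_PARTY_SHARING = auto()

def sharing_allowed(category: DataCategory, purpose: Purpose, has_consent: bool) -> bool:
    """Apply the court's distinction: internal risk analysis may rely on the
    credit-protection basis, but transferring identifiable registration data
    to third parties requires specific consent."""
    if purpose is Purpose.INTERNAL_RISK_ANALYSIS:
        return True  # credit protection covers internal processing
    if category is DataCategory.IDENTIFIABLE_REGISTRATION:
        return has_consent  # explicit consent needed for external transfer
    return True  # scores and regulated history follow their own regimes
```

A gate like this makes the legal-basis decision explicit and auditable at the point where a data flow is triggered, rather than assumed at the source.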
Source document (simplified)
ANALYSIS
Published 25 March 2026
Contributors:
Rafael Avellar Centoducatte
CIPM, CDPO/BR
Privacy specialist
Hapvida
Those working in privacy, credit or data governance have seen it before. At some point, it clicks: names, addresses, phone numbers and even estimated income are being shared with third parties — not because of missed payments, but simply because individuals exist as consumers in a credit-driven economy.
This was the background of a case decided by Brazil's Superior Court of Justice in 2025, in REsp 2.201.694/SP. In the lawsuit, a consumer challenged the sharing of his identifiable registration data by a credit information management and provision company, otherwise known as a credit bureau, to third parties without specific consent.
The decision quickly became a landmark in the interpretation of Brazil's General Data Protection Law as applied to the credit market, establishing clearer limits on the sharing of personal data in the absence of consent.
The ruling revives a tension well-known to privacy professionals. How far may data circulate in the name of credit protection before colliding with the data subject's informational self-determination? And, in practical terms, what changes in risk management and regulatory compliance?
Not all credit data is the same
One of the merits of the decision was forcing the market to acknowledge something historically treated as homogeneous. Not all "credit data" is legally equivalent.
On one side lies credit scoring, understood as a statistical model that generates a risk score based on aggregated variables. Brazilian case law, aligned with international practice, has long recognized that scoring may be used without consent, provided principles such as transparency, proportionality and non-discrimination are respected. A score expresses a probability, not an individual's identity.
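To make the contrast concrete, a score built from aggregated behavioral variables carries no direct identifier. The following toy model is entirely hypothetical (variables, weights, and scale are invented for illustration):

```python
import math

def toy_credit_score(utilization: float, on_time_ratio: float, inquiries_90d: int) -> int:
    """Map aggregated, non-identifying variables to a 0-1000 score.
    The output expresses a probability of repayment, not an identity:
    no name, taxpayer ID, or other registration data is involved."""
    z = 2.0 * on_time_ratio - 1.5 * utilization - 0.2 * inquiries_90d
    probability_good = 1.0 / (1.0 + math.exp(-z))  # logistic transform
    return round(probability_good * 1000)
```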
On another level is credit history, composed of information about past and present payment behavior. In Brazil, this category is regulated under the Positive Credit Registry, which allows for automatic inclusion while preserving relevant rights of the data subject.
The third decisive category consists of identifiable registration data, such as name, taxpayer ID number and estimated income. These are not inferences, but data that directly identifies an individual and shapes their position in the consumer market.
The court's message was clear. Treating these categories as legally equivalent is no longer acceptable.
'Credit protection' as a legal basis, not a blank check
The main defense advanced by the company involved is well-known in the market. The LGPD authorizes the processing of personal data for credit protection purposes, regardless of consent, and for years this provision was interpreted as broad authorization for data sharing throughout the credit ecosystem.
The Superior Court did not invalidate this legal basis, but it significantly narrowed its scope.
According to the majority opinion, there is a legally relevant distinction between processing data internally for risk analysis and making personal data available to third parties, such as consulting firms or market intelligence companies. Credit protection may justify internal processing and model building, but it does not automatically authorize the transfer of identifiable data without consent.
In so holding, the court rejected the notion that this legal basis operates as a generic authorization for the circulation of personal data. Purpose matters, as do the form, context and scope of the sharing.
Although the distinction may appear subtle, its implications are far-reaching. Many business models were built on the assumption that once the purpose was legitimized at the source, subsequent sharing would follow naturally. That assumption now requires revision.
Presumed moral damage and a new risk calculus
Perhaps the most impactful aspect of the decision lies not only in consent, but in how the court addressed civil liability.
It confirmed that the unlawful sharing of identifiable registration data gives rise to presumed moral damages. The data subject need not demonstrate financial loss or denial of credit; the violation of informational self-determination alone is sufficient to characterize harm.
For privacy professionals, this significantly alters the risk landscape. Administrative sanctions under the LGPD were already on the radar, but the decision adds a clear pathway for large-scale litigation, dispensing with individualized proof of damage.
In this context, data sharing governance, legal bases and consent management cease to be mere compliance tools and begin to influence strategic decisions regarding system architecture, contractual design and litigation exposure.
The cost of consent
The decision was not unanimous. The dissenting opinion raised a counterpoint by warning of the economic effects of excessive restrictions on data flows in the credit market.
The argument is familiar. The lower the volume and quality of information available for risk assessment, the greater the uncertainty faced by lenders, typically passed on to the market in the form of higher interest rates, stricter criteria and reduced access to credit, including for so-called "good payers."
From this perspective, broad data availability for credit analysis would reduce adverse selection problems, with positive effects on rates, terms and contractual conditions.
Under this view, requiring consent for data sharing could deepen informational asymmetries and undermine the social function of credit. The dissent did not deny the importance of privacy but questioned whether the adopted solution might produce undesirable systemic effects.
The court acknowledged this tension but made a clear normative choice. Faced with the conflict between economic efficiency and informational self-determination, privacy prevailed. Whether this balance is the most appropriate remains an open debate.
What changes for privacy and credit professionals
For organizations operating within the credit ecosystem, the effects of the decision are immediate. Data flows involving the sharing of identifiable data with third parties must be reassessed. The idea that a legitimate purpose at the outset suffices to justify the entire processing cycle no longer holds.
When consent is required, it cannot be generic or improvised. It must be specific, verifiable and operationally viable. This raises practical questions. Who will be responsible for collecting it, at what point in the data subject's journey, and how will it be audited over time?
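One way to operationalize "specific, verifiable and auditable" consent is to record each consent event as an immutable, tamper-evident entry. The schema below is a hypothetical sketch (field names and the fingerprint approach are assumptions, not anything mandated by the LGPD or the ruling):

```python
import hashlib
import json
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class ConsentRecord:
    """One specific, auditable consent event (hypothetical schema)."""
    subject_id: str    # e.g. a hashed taxpayer ID, never the raw identifier
    purpose: str       # the specific sharing purpose consented to
    recipient: str     # the named third party receiving the data
    collected_by: str  # which system or team collected the consent
    collected_at: str  # ISO-8601 timestamp of collection
    evidence: str      # e.g. form version or interaction-log reference

    def fingerprint(self) -> str:
        """Tamper-evident digest so later audits can verify the record."""
        payload = json.dumps(self.__dict__, sort_keys=True).encode()
        return hashlib.sha256(payload).hexdigest()

record = ConsentRecord(
    subject_id="sha256:…",
    purpose="share estimated income with market-intelligence partner",
    recipient="ExamplePartner Ltda.",
    collected_by="onboarding-web-form-v3",
    collected_at=datetime.now(timezone.utc).isoformat(),
    evidence="form v3, checkbox 'share_registration_data'",
)
```

Answering the article's three questions in data: `collected_by` records who, `collected_at` records when, and the fingerprint gives auditors a way to verify the record over time.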
These questions become even more pressing when looking backward along data flows. How should shared databases built under assumptions now challenged by the Superior Court be handled? Will contracts need revision, legal bases revalidated or certain flows discontinued? The decision does not provide direct answers but makes confronting these issues unavoidable.
More than ever, the ruling reinforces that privacy by design is not limited to security measures. It concerns process architecture, the clear definition of roles among actors and the logic governing data circulation over time.
Organizations that treat consent as a mere formal requirement are likely to face growing regulatory and litigation risks. Those that rethink their operational models and governance structures from the outset will be better positioned to adapt.
A turning point, not the final word
The judgment does not close the debate on credit and data protection in Brazil but marks a more mature phase of the discussion. Although grounded in domestic precedent, the decision resonates with global debates on credit scoring, data sharing and accountability in data-driven ecosystems.
What is clear is that the era of treating credit protection as a universal justification for personal data sharing has ended. The challenge now is not to choose between credit and privacy, but to design systems capable of legitimately sustaining both.
For privacy professionals, this challenge is no longer theoretical. It is practical, strategic and unavoidable.
Tags: Data security, Litigation and case law, Law and regulation, Finance and banking, LGPD, Privacy