Zimbabwe News Update

🇿🇼 Published: 03 January 2026
📘 Source: TimesLIVE

South Africans rightly care deeply about the privacy of children. That concern sits at the heart of the Protection of Personal Information Act (Popia) and must be taken seriously by government, regulators and courts alike. However, privacy protection is weakened — not strengthened — when the concept of “personal information” is unduly restricted.

Recent litigation involving the minister of basic education and the information regulator has brought this issue into sharp focus. As ICT and data-governance lawyers practising in South Africa, we are concerned that the court’s reasoning on pseudonymised information [personal information that can no longer be attributed to a specific data subject without the use of additional information] did not sufficiently engage with the real-world context in which matric results are written, known, shared and re-identified. This matters because Popia does not protect information in the abstract.

It protects information where a person is identifiable by reasonably foreseeable means. Context is therefore not optional; it is the test. International courts have recognised this clearly.


In the 2025 European Single Resolution Board (SRB) litigation, pseudonymised data was shared with Deloitte for a limited auditing purpose. In determining whether the data was “personal data” in Deloitte’s hands, the court emphasised several decisive contextual factors: the data was disclosed to a single professional recipient; Deloitte had no legal or practical access to the re-identification key; Deloitte was contractually bound not to attempt re-identification; and there was no reasonably foreseeable lawful way for Deloitte to identify the data subjects. In that setting, the court accepted that pseudonymised data could fall outside the scope of “personal data” in the hands of the recipient, Deloitte.

The matric-results context is fundamentally different.

Second, contractual and legal constraints. Unlike Deloitte, the general public has no contractual obligation not to re-identify learners. There is no enforceable legal duty on a learner, parent or friend to “forget” an examination number or to refrain from connecting it to a name.

Third, foreseeability of re-identification. Re-identification of matric numbers is not hypothetical or technologically sophisticated; it is designed into the examination process itself. Learners see their own numbers on scripts. They sit next to classmates. They observe seating arrangements. They glance at papers while entering or leaving exam halls. Parents are given the numbers. Schools distribute them. The system assumes — and requires — that the learner and family know the number. To suggest that public re-identification is not reasonably foreseeable in this context is to ignore how matric exams actually work. This is the critical distinction.

In the SRB case, re-identification was legally and practically remote. In the matric context, re-identification is ordinary, expected and inevitable. The same legal label — “pseudonymised” — cannot produce the same outcome in two radically different factual environments.
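
To make the contextual point concrete, the following is a minimal, hypothetical Python sketch (not drawn from the judgments or from Popia; every exam number, name and mark is invented) of how the same pseudonymised record tells an outside recipient nothing, yet identifies a learner instantly to a classmate or parent who already knows the number.

    # Illustrative sketch only (hypothetical): the same pseudonymised record is
    # non-identifying in one context and trivially identifying in another.
    # Every exam number, name and mark below is invented.

    published_results = {"EX-1042": 78, "EX-1043": 61}  # exam number -> mark

    def who_is(exam_number, linking_knowledge):
        """Return the person behind an exam number, if the holder can link it."""
        return linking_knowledge.get(exam_number, "cannot be identified")

    # Context 1: an outside professional recipient (the Deloitte scenario) with
    # no access to any linking key and a contractual bar on re-identification.
    outside_recipient_knowledge = {}

    # Context 2: a classmate or parent who already knows the number from the
    # exam hall, the school or the learner themselves (the matric scenario).
    classmate_knowledge = {"EX-1042": "Learner A"}

    for exam_number, mark in published_results.items():
        print("outside recipient:", exam_number, mark, "->",
              who_is(exam_number, outside_recipient_knowledge))
        print("classmate/parent: ", exam_number, mark, "->",
              who_is(exam_number, classmate_knowledge))

The re-identification step is nothing more than a lookup against knowledge the recipient already holds, which is precisely why the identity of the recipient, and not the label on the data, does the legal work.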

Popia itself points us in this direction. It asks whether information can be used, manipulated or linked by a reasonably foreseeable method to identify a person. That inquiry is contextual, not theoretical.

It must consider who receives the information, under what conditions, and with what realistic capabilities. Popia also requires [section 2(b)] that the Information Regulator and courts consider international precedent such as the SRB case above. Our focus is on ensuring that courts and regulators apply Popia coherently and credibly.

If pseudonymised data is treated as non-personal in tightly controlled professional settings, but also treated as non-personal when released to the entire public without adequate safeguards, the law loses internal consistency. Privacy law must be principled to be effective. Context is not a loophole; it is the mechanism by which the law distinguishes safe data use from harmful exposure. In the case of matric results, that contextual analysis deserved closer attention, and it was certainly not “fanciful” to argue, as the Information Regulator did, that personal information would be available to someone other than the data subject.

