Nigeria’s Competition and Consumer Protection Tribunal said on Friday that Meta’s appeal against the fine had been unsuccessful.
The Federal Competition and Consumer Protection Commission (FCCPC) levied the fine after an investigation that began after Meta’s WhatsApp updated its privacy policy in May 2021 and concluded in December 2023.
Privacy
The probe, which was conducted along with the Nigeria Data Protection Commission (NDPC), found the privacy policy was imposed on Nigerian users without following standards of fairness.
The commission said Meta had provided documents and retained counsel who met with the agency.
The agency’s final order mandated steps and actions Meta must take to comply with local laws, Abdullahi said.
Nigeria is Africa’s most populous country and had some 154 million active internet users as of 2022, according to the country’s statistics agency.
The FCCPC said Meta had failed to engage a Data Protection Compliance Organisation and had not filed a Nigeria Data Protection Regulation audit report for two years.
Data rules
Meta has faced similar charges in other jurisdictions, including the EU, where privacy groups complained about the company’s plans to train its AI systems on users’ data without obtaining consent.
Meta last year initially withheld the release of its multimodal Llama AI model in the EU, citing “unpredictable” regulatory requirements, but began rolling it out in the EU last month.
The company’s use of EU data to train its AI models, which began this month, has been challenged by privacy advocates, who say it legally must obtain opt-in consent, rather than only allowing users to opt out.
By Matthew Broersma, Silicon