
Case Studies: High-Profile Cases of Privacy Violation


Smith Gambrell & Russell


Uber Technologies

The scenario: In April 2018, the FTC announced an expanded settlement with Uber Technologies for its alleged failure to reasonably secure sensitive data in the cloud, resulting in a data breach of 600,000 names and driver's license numbers, 22 million names and phone numbers, and more than 25 million names and email addresses.

The settlement: The expanded settlement is a result of Uber's failure to disclose a significant data breach that occurred in 2016 while the FTC was conducting the investigation that led to the original settlement. The revised proposed order includes provisions requiring Uber to disclose any future consumer data breaches, submit all reports from third-party audits of Uber's privacy program, and retain reports on unauthorized access to consumer data. 2

Emp Media Inc. (Myex.com)

The scenario: The FTC joined forces with the State of Nevada to address privacy issues arising from the "revenge" pornography website Myex.com, run by Emp Media Inc. The website allowed individuals to post intimate photos of victims along with personal information such as names, addresses, phone numbers and social media accounts. If victims wanted their photos and information removed from the website, the defendants reportedly charged fees of $499 to $2,800 to do so.

The settlement: On June 15, 2018, the enforcement action brought by the FTC led to a shutdown of the website and permanently prohibited the defendants from posting intimate photos and personal information of other individuals without their consent. The defendants were also ordered to pay more than $2 million. 3

Lenovo and Vizio

The scenario: In 2018, FTC enforcement actions led to large settlements with technology manufacturers Lenovo and Vizio. The Lenovo settlement related to allegations the company sold computers in the U.S. with pre-installed software that sent consumer information to third parties without the knowledge of the users. With the New Jersey Office of Attorney General, the FTC also brought an enforcement action against Vizio, a manufacturer of "smart" televisions. Vizio entered into a settlement to resolve allegations it installed software on its televisions to collect consumer data without the knowledge or consent of consumers and sold the data to third parties.

The settlement: Lenovo entered into a consent agreement to resolve the allegations through a decision and order issued by the FTC. The company was ordered to obtain affirmative consent from consumers before running the software on their computers and implement a software security program on preloaded software for the next 20 years. 4 Vizio agreed to pay $2.2 million, delete the collected data, disclose all data collection and sharing practices, obtain express consent from consumers to collect or share their data, and implement a data security program. 5

VTech

The scenario: The FTC's action against toy manufacturer VTech was the first time the FTC became involved in a children's privacy and security matter.

The settlement: In January 2018, the company entered into a settlement to pay $650,000 to resolve allegations it collected personal information from children without obtaining parental consent, in violation of COPPA. VTech was also required to implement a data security program that is subject to audits for the next 20 years. 6

LabMD

The scenario: LabMD, a cancer-screening company, was accused by the FTC of failing to reasonably protect consumers' medical information and other personal data. Identity thieves allegedly obtained sensitive data on LabMD consumers due to the company's failure to properly safeguard it. The billing information of 9,000 consumers was also compromised.

The settlement: After years of litigation, the case was heard before the U.S. Court of Appeals for the Eleventh Circuit. LabMD argued, in part, that data security falls outside of the FTC's mandate over unfair practices. The Eleventh Circuit issued a decision in June 2018 that, while not stripping the FTC of authority to police data security, did challenge the remedy imposed by the FTC. 7 The court ruled that the cease-and-desist order issued by the FTC against LabMD was unenforceable because the order required the company to implement a data security program that needed to adhere to a standard of "reasonableness" that was too vague. 8

The ruling points to the need for the FTC to provide greater specificity in its cease-and-desist orders about what is required by companies that allegedly fail to safeguard consumer data.

1 15 U.S.C. § 45(a)(1)

2 www.ftc.gov/news-events/press-releases/2018/04/uber-agrees-expanded-settlement-ftc-related-privacy-security

3 www.ftc.gov/system/files/documents/cases/emp_order_granting_default_judgment_6-22-18.pdf

4 www.ftc.gov/news-events/press-releases/2018/01/ftc-gives-final-approval-lenovo-settlement

5 www.ftc.gov/news-events/press-releases/2017/02/vizio-pay-22-million-ftc-state-newjersey-settle-charges-it

6 www.ftc.gov/news-events/press-releases/2018/01/electronic-toy-maker-vtech-settlesftc-allegations-it-violated

7 The United States Court of Appeals for the Third Circuit has rejected this argument. See FTC v. Wyndham Worldwide Corp., 799 F.3d 236, 247-49 (2015).

8 www.media.ca11.uscourts.gov/opinions/pub/files/201616270.pdf

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.

Marcia M. Ernst


The Privacy Perspective

Legal blogging on the protection of privacy in the 21st century

Top 10 Privacy and Data Protection Cases 2022

Inforrm covered a wide range of data protection and privacy cases in 2022. Following my posts in 2018, 2019, 2020 and 2021, here is my selection of notable privacy and data protection cases across 2022.

  • ZXC v Bloomberg  [2022] UKSC 5

This was the seminal privacy case of the year, decided by the UK Supreme Court. The Court considered whether, in general, a person under criminal investigation has, prior to being charged, a reasonable expectation of privacy in respect of information relating to that investigation.

The case concerned ZXC, a regional CEO of a PLC which operated overseas. An article was published concerning the PLC’s operations for which ZXC was responsible. The article focused almost exclusively on the contents of a letter sent to a foreign law enforcement agency by a UK law enforcement agency that was investigating the PLC’s activities in the region.

ZXC claimed a reasonable expectation of privacy in relation to the fact and details of a criminal investigation into his activities, disclosed by the letter, and that the publication of the article by Bloomberg amounted to a misuse of that private information. He argued that details of the law enforcement’s investigations into him, the fact that it believed that he had committed criminal offences and the evidence that was sought were all private.

At first instance Nicklin J found for the claimant, a finding which was upheld by the Court of Appeal. There were three issues before the UK Supreme Court hearing a further appeal by Bloomberg:

(1) Whether the Court of Appeal was wrong to hold that there is a general rule, applicable in the present case, that a person under criminal investigation has, prior to being charged, a reasonable expectation of privacy in respect of information relating to that investigation.

(2) Whether the Court of Appeal was wrong to hold that, in a case in which a claim for breach of confidence was not pursued, the fact that information published by Bloomberg about a criminal investigation originated from a confidential law enforcement document rendered the information private and/or undermined Bloomberg’s ability to rely on the public interest in its disclosure.

(3) Whether the Court of Appeal was wrong to uphold the findings of Nicklin J that the claimant had a reasonable expectation of privacy in relation to the published information complained of, and that the article 8/10 balancing exercise came down in favour of the claimant.

The Court dismissed the appeal on all three grounds. The precedent is therefore established that, as a legitimate starting point, there is a reasonable expectation of privacy in relation to the fact and details of a criminal investigation at the pre-charge stage.

There was an Inforrm case comment on the case. See also the Panopticon Blog and the 5RB case comment.

  • Driver v CPS [2022] EWHC 2500 (KB)

My second case also concerns law enforcement investigations, this time the passing of a file to the CPS and the disclosure of that fact to a third party. Whilst the disclosure did not include the name of the claimant, it was found that “personal data can relate to more than one person and does not have to relate exclusively to one data subject, particularly when the group referred to is small.”

In this case, the operation in question, Operation Sheridan, concerned only eight suspects, of whom the claimant was one. It should be noted that the claim was one under the Data Protection Act 2018, not the GDPR.

In finding for the claimant on the data protection grounds, but dismissing the claim for misuse of private information, the Judge made a declaration and awarded £250 in damages. It should be noted that the “data breach was at the lowest end of the spectrum.”

See the Panopticon Blog on the case.

  • AB v Chief Constable of British Transport Police [2022] EWHC 2740 (KB)

The respondent, an individual with autistic spectrum disorder of the Asperger’s type, claimed that the police’s retention of his information, relating to accusations in 2011 and 2014 that he touched women inappropriately, was unlawful. The respondent stims by rubbing fabric between his fingers. In both cases no prosecution was brought against AB.

The respondent’s claim was based on the facts that the data retained were inaccurate and that their retention was a disproportionate interference with his right to respect for his private life under Article 8 of the European Convention on Human Rights.

In December 2017, Bristol County Council was contacted with safeguarding concerns about AB; in particular, that he was suffering ongoing trauma due to the appellant maintaining ongoing false allegations against him.

As to the claims for inaccuracy: “he complained that the records retained by the police inaccurately record that AB put his hands between the legs, and under the dress, of the 2011 complainant. He also implicitly complained that the records of the 2014 incident were inaccurate insofar as they suggested that AB had placed his hand over the complainant’s jeans in the area of her vagina.”

At first instance it was found that the police records were inaccurate and that their retention was a disproportionate interference with AB’s Article 8 rights; AB was awarded £15,000 for distress, £15,000 for loss of earnings, and £6,000 in aggravated damages.

On appeal, however, it was found that “the police records in this case are intended to reflect the information that was provided to the police, rather than the underlying facts as to what happened. On this issue I have reached a different conclusion from the judge, with the result that I have concluded that the OSRs are accurate. To this narrow extent, the appeal succeeds.” [95]

However, the article 8 finding for the claimant was upheld, as was, accordingly, the judge’s declaration that retention was unlawful and the assessment of damages.

  • Chief Constable of Kent Police v Taylor  [2022] EWHC 737 (QB)

A breach of confidence claim relating to a series of videos with which the defendant was provided by Berryman’s Lace Mawer LLP (“BLM”). The videos were said to contain sensitive information in relation to a vulnerable minor, KDI, who was the subject of an anonymity order in civil proceedings. The videos themselves were particularly sensitive, relating to police interviews of KDI in relation to criminal allegations against them.

The defendant had sued the CC of Kent Police for damage to his front door which occurred in the course of entering his property to search for child pornography, and BLM acted for the CC of Kent Police in relation to that matter. It was during the course of those proceedings that the defendant was given access to the videos, which related to an unrelated claim.

The defendant refused to delete the videos upon request or to explain his dealings with them, instead demanding payment of thousands of pounds for his cooperation with the requests.

The Judge accordingly ordered that the defendant disclose matters in relation to his dealings with the videos, to ensure confidentiality had not been breached. A further, unusual, order was granted for independent permanent deletion of the videos. It should be noted that the order considered the defendant’s privacy in the course of such an independent assessment being undertaken, with the judge stating: “I have built in a safeguard in the order I propose to make to limit the nature of the independent IT expert’s role to protect Mr Taylor’s privacy interests”.

  • Various Claimants v MGN  [2022] EWHC 1222 (Ch)

A case concerning the ongoing phone hacking litigation against Mirror Group Newspapers (“MGN”) in which MGN issued and served applications for summary judgment in 23 individual claims. The judge grouped the claims, with this judgment considering six claimants.

The judge considered whether claimants should have been put on notice at various times up until and following the first primary trial in the scandal on 21 May 2015. The judge found that such matters were not “clear-cut” for the purposes of determining whether summary judgment could be entered; they were more appropriate to be settled at trial. There was a comment on the case on the JMW blog. On 11 August 2022 Andrews LJ refused MGN permission to appeal.

  • Brake v Guy  [2022] EWCA Civ 235

The claimants appealed an order dismissing their claim for a final injunction and damages for misuse of private information and breach of confidence. The claim was made in relation to a series of emails sent to and received by the first claimant, Mrs Brake, into a business general enquiries email account. The Court reviewed “the judge’s evaluation of the evidence which led him to conclude that they had no reasonable expectation of privacy in respect of the contents of the enquiries account and that the information was not imparted to the Guy Parties in circumstances which gave rise to an obligation of confidence.”

Only two of the tranche of 3,149 emails were produced for the judge to consider; he was, understandably, not inclined to accept that there was a reasonable expectation of privacy in relation to the emails on the basis of those two alone. The burden of proof was considered “a very substantial hurdle”, one the claimants had “fallen well short of surmounting”.

The arguments for breach of confidence were advanced on the same grounds and dismissed. The Court concluded: “the claimants have put forward no argument before this Court which persuades me that the judge was wrong to conclude that the personal information in the enquiries account was not “imparted in circumstances importing an obligation of confidence.””

The case is instructive as to the method and approach to be taken when claiming there is a reasonable expectation of privacy or obligation of confidence in relation to a high volume of documents. It also provides a tacit reminder of the difficulty of overturning first instance privacy decisions on appeal. There was a DLA Piper case comment.

  • TU and RE v Google LLC [2022] EUECJ C-460/20

A case concerning two claimants applying for the delisting of search results under Article 17 of the GDPR.

The case is instructive as to the pleading of inaccuracy of data in erasure requests: where it arises and, where it does, how such a request should be dealt with.

  • The case states at [72] and [73]: “where the person who has made a request for de-referencing submits relevant and sufficient evidence capable of substantiating his or her request and of establishing the manifest inaccuracy of the information found in the referenced content or, at the very least, of a part – which is not minor in relation to the content as a whole – of that information, the operator of the search engine is required to accede to that request for de-referencing. The same applies where the data subject submits a judicial decision made against the publisher of the website, which is based on the finding that information found in the referenced content – which is not minor in relation to that content as a whole – is, at least prima facie, inaccurate”, and
  • “By contrast, where the inaccuracy of such information found in the referenced content is not obvious, in the light of the evidence provided by the data subject, the operator of the search engine is not required, where there is no such judicial decision, to accede to such a request for de-referencing. Where the information in question is likely to contribute to a debate of public interest, it is appropriate, in the light of all the circumstances of the case, to place particular importance on the right to freedom of expression and of information”.

For further analysis please see the Panopticon Blog’s excellent analysis of this case.

  • SMO v TikTok Inc. [2022] EWHC 489 (QB)

The former Children’s Commissioner for England’s case against TikTok for data protection infringements and misuse of private information was discontinued this year. The result was due to the myriad procedural issues arising in the case, including permission to serve out of the jurisdiction, extension of time, and permission to serve on UK lawyers instead. The case serves as a warning for claimants seeking to issue data protection claims outside the jurisdiction to do so in proper time and with consideration of matters such as service out of the jurisdiction.

See the Panopticon Blog on the case and on the discontinuance of the claim.

  • Smith & Other v TalkTalk Telecom Group Plc  [2022] EWHC 1311 (QB)

A claim under the Data Protection Act 1998 and the tort of misuse of private information, following a mass data breach. The case concerned three applications:

  • For strike out of the misuse of private information claim and references to unconfirmed breaches in the particulars;
  • For permission to amend the particulars of claim in light of the case  Warren v DSG Retail Ltd  [2021] EWHC 2168 (QB); and
  • An application for further information.

The misuse of private information claim was dismissed. Although the claim had been repleaded to focus on “acts” rather than “omissions” (in an attempt to avoid the consequences of the Warren decision), the Judge followed his own decision in Warren, holding that the action was, in substance, a claim in negligence and that creating a situation of vulnerability to third party data theft was not a claim in misuse of private information. There was an Inforrm post on the case and a two-part discussion of the issues here and here. See also the Panopticon Blog on the case.

This case was the final nail in the coffin of mass data breach claims on CFAs supported by ATE insurance (as these are not available in data protection cases). Unless forming part of group litigation, data breach claims are likely to be transferred to the small claims track (see Stadler v Currys Group Limited [2022] EWHC 160 (QB)).

  • Owsianik v. Equifax Canada Co., 2022 ONCA 813

An appeal arising out of three separate class actions in which the plaintiffs sought to apply the tort of intrusion upon seclusion in “data breach” cases. The Ontario Court of Appeal held that on the facts as pleaded, the defendants did not do anything that could constitute an act of intrusion or invasion into the privacy of the plaintiffs. The intrusions alleged were committed by unknown third-party hackers, acting independently from, and to the detriment of, the interests of the defendants. The defendants’ alleged fault was their failure to protect the plaintiffs from unknown hackers, which could not be transformed into an invasion by the defendants of the plaintiffs’ privacy.

This decision in Ontario is consistent with the approach of the English court in case No. 9 above. There were case comments by Blakes and McCarthy Tetrault.


Sensors (Basel)

A Case Study on the Development of a Data Privacy Management Solution Based on Patient Information

Arielle Verri Lucca

1 Laboratory of Embedded and Distribution Systems, University of Vale do Itajaí, Rua Uruguai 458, C.P. 360, Itajaí 88302-901, Brazil; rb.ilavinu.ude@elleira (A.V.L.); se.lasu@sotsuguasiul (L.A.S.); rb.ilavinu.ude@logirdor (R.L.); [email protected] (L.G.)

Luís Augusto Silva

2 Expert Systems and Applications Lab, Faculty of Science, University of Salamanca, Plaza de los Caídos s/n, 37008 Salamanca, Spain; se.lasu@oamgnezux

Rodrigo Luchtenberg

Leonardo Garcez

Raúl García Ovejero

3 Expert Systems and Applications Lab., E.T.S.I.I of Béjar, University of Salamanca, 37008 Salamanca, Spain; se.lasu@jevoluar

Ivan Miguel Pires

4 Instituto de Telecomunicações, Universidade da Beira Interior, 6200-001 Covilhã, Portugal; tp.ibu.ti@seripmi

5 Computer Science Department, Polytechnic Institute of Viseu, 3504-510 Viseu, Portugal

6 UICISA:E Research Centre, School of Health, Polytechnic Institute of Viseu, 3504-510 Viseu, Portugal

Jorge Luis Victória Barbosa

7 Applied Computing Graduate Program, University of Vale do Rio dos Sinos, Av. Unisinos 950, São Leopoldo, RS 93.022-750, Brazil; rb.sonisinu@asobrabj

Valderi Reis Quietinho Leithardt

8 Departamento de Informática da Universidade da Beira Interior, 6200-001 Covilhã, Portugal

9 COPELABS, Universidade Lusófona de Humanidades e Tecnologias, 1749-024 Lisboa, Portugal

10 VALORIZA, Research Center for Endogenous Resources Valorization, Instituto Politécnico de Portalegre, 7300-555 Portalegre, Portugal

Data on the diagnosis of infection in the general population are strategic for different applications in the public and private spheres. Among them, data related to symptoms and people's movement stand out, mainly considering highly contagious diseases. These data are sensitive and require data privacy initiatives to enable their large-scale use. The search for population-monitoring strategies aims at social tracking, supporting the surveillance of contagion in response to the COVID-19 pandemic. There are several data privacy issues in environments where IoT devices are used for monitoring hospital processes. In this research, we compare works related to the subject of privacy in the health area and propose a taxonomy to support the requirements necessary to control patient data privacy in a hospital environment. Based on this taxonomy, we modeled and implemented a mobile application to analyze the privacy and security constraints associated with COVID-19. In the tests and comparisons performed, the application obtained results that contribute to the scenarios applied.

1. Introduction

Internet of Things (IoT) devices can be applied in various sectors, acting as a facilitating tool [ 1 ]. Devices may help monitor health conditions without the presence of healthcare professionals [ 2 ]. There are also wireless technologies that monitor older adults and remotely send data such as heart rate and blood pressure to their caregivers [ 3 ]. In addition to monitoring, other devices have auxiliary functions, such as automatic insulin injection devices [ 4 ]. These are directly linked to sensitive patient data and provide additional control in critical situations by, for example, setting the dose to be injected into the insulin pump. Both privacy settings and control information must have an extreme level of security.

For hospital environments, IoT devices are distributed not only for patient use but also for other functionalities. According to Farahani et al. [ 5 ], some of the IoT applications used in hospital settings collect patient data, such as heart rate, blood pressure, or glucose level. As far as the environment is concerned, some sensors detect temperature changes or control the air conditioning; cameras are used to detect intruders and send alerts. In this context, the devices’ scope ranges from patient monitoring to evaluate the environment and the equipment used by health professionals. Thus, the data is recorded from the moment that patients are registered at the reception until they are discharged.

When the patient is registered for admission, basic information is collected and complemented after screening. In a first-aid environment, to ensure all patients’ safety, many hospitals use a screening technique known as the Manchester Protocol [ 6 ]. After screening, the information is added to the patient’s record. Next, the person is given a classification according to their condition; this varies from non-urgent cases to emergency intervention cases. Sensitive information is added to the user record, whose preservation and confidentiality level must be treated as critical. There is information that should not be disclosed or related to the patient, as is the case with a patient suspected of having viral and infectious diseases.

The current pandemic of the disease (COVID-19) caused by Severe Acute Respiratory Syndrome Coronavirus 2 (SARS-CoV-2) means that a patient may be identified as a possible carrier during the screening process, based on certain symptoms. According to Rothan and Siddappa [ 7 ], those infected usually show symptoms after approximately five days, the most common signs of illness being fever, cough, and fatigue; the patient may also present headaches, phlegm, hemoptysis, diarrhea, shortness of breath, and lymphopenia. These symptoms are identifiable without specific examinations and are directly documented in the patient’s medical record. Liang et al. [ 8 ] mention that, among patients diagnosed with COVID-19, 85.7% had fever, 42.9% had cough, 33.3% had expectoration, 57.1% had fatigue, and 38.1% had headache and dizziness. For this reason, one can see that fever is a common symptom; thus, this condition must be checked as soon as the patient is admitted to the hospital. Due to COVID-19’s high rate of contagion, the patient’s referral to medical care and subsequent isolation should be done quickly and strictly upon confirmation.
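
The symptom check described above can be sketched as a simple admission-time flag. This is an illustrative sketch only, not the paper's implementation: the symptom list, the 37.8 °C fever threshold, and the function names are assumptions chosen for illustration.

```python
# Illustrative sketch: flag a possible COVID-19 case from data recorded at
# screening. Threshold and symptom set are assumed values, not the paper's.
FEVER_THRESHOLD_C = 37.8
COMMON_SYMPTOMS = {"fever", "cough", "fatigue"}


def flag_suspected_case(temperature_c: float, symptoms: set) -> bool:
    """Return True when the patient should be flagged for isolation
    and confidential handling of their record."""
    has_fever = temperature_c >= FEVER_THRESHOLD_C
    if has_fever:
        symptoms = symptoms | {"fever"}
    # Fever plus at least one other common symptom raises the flag.
    return has_fever and len(symptoms & COMMON_SYMPTOMS) >= 2


print(flag_suspected_case(38.2, {"cough"}))            # fever + cough
print(flag_suspected_case(36.6, {"cough", "fatigue"}))  # no fever
```

A real triage system would of course follow a clinical protocol such as the Manchester Protocol mentioned earlier, rather than a fixed rule like this.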

When it is confirmed that the patient has a COVID-19 infection, this information is directly linked to their record, which should remain confidential. Soares and Dall’Agnol [ 9 ] comment that privacy is considered an individual right that includes the protection of the intimacy of subjects, respect for dignity, and limitation of access to the body, intimate objects, and family and social relationships. The same concern also covers all information collected during the patient care process. Even though patients’ data must be kept confidential among all parties in general, the current pandemic situation and contagion rate demand extra precaution, so that patients can be included in statistics without having their information revealed. Privacy must be applied to patient data at all levels with access to any information, be it registration, device, or image.

The main purpose of this work is to apply privacy constraints for patients with suspected COVID-19. The basis for the application of privacy is the same as for patients in general, but it takes into account the pandemic situation, in which discretion in handling the data of a suspected patient is crucial. Also, as the virus is highly contagious, the process from admission to the emergency room to the patient’s referral must be completed quickly. To this end, a taxonomy was proposed that covers four topics and five subtopics regarding the entities/environments participating in the hospital admission process.

The scientific contribution of this paper is a system to support the privacy constraints related to COVID-19. It started with a study of the state of the art in the hospital environment. Next, we defined a taxonomy, and a mobile application was implemented to test and validate coverage of the privacy constraints defined in the taxonomy.

The main results of this study are related to the identification of users. Cryptography methods were implemented to control users according to the diagnosis of COVID-19. As these data are health related, they must be secure and anonymous. The data collected included reliable temperature readings for the detection of symptoms such as fever.
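
One common way to keep identification data both linkable and anonymous, as the paragraph above requires, is to replace the patient identifier with a keyed hash (a pseudonym). The sketch below is an assumption-laden illustration, not the paper's actual cryptography: the key, identifier format, and record layout are all invented for the example.

```python
import hashlib
import hmac

# Illustrative sketch: pseudonymise a patient identifier with a keyed hash so
# diagnosis records can be linked without exposing identity. The key and the
# record layout are assumptions, not the paper's implementation.
SECRET_KEY = b"hospital-secret-key"  # in practice, held in secure key storage


def pseudonym(patient_id: str) -> str:
    """Deterministic pseudonym: the same patient always maps to the same token,
    but the identity cannot be recovered without the key."""
    return hmac.new(SECRET_KEY, patient_id.encode(), hashlib.sha256).hexdigest()


# A stored record carries the pseudonym instead of the real identifier.
record = {
    "patient": pseudonym("BR-12345"),
    "temperature_c": 38.2,
    "covid_suspected": True,
}
print(record["patient"][:16])  # first characters of the 64-char hex token
```

Because the hash is keyed, an attacker who obtains the records alone cannot brute-force identifiers back to patients without also compromising the key.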

For a better understanding of the matter and a clearer overview of the relevant details, this work is organized as follows: Section 2 lists the related works; Section 3 describes the taxonomic definition developed for this project and the attributes of the user, environment, privacy, and device parameters; Section 4 demonstrates the modeling of the project, including the use case, sequence, and context diagrams; Section 5 presents the prototype with the application developed for validation; Section 6 presents the experiments and results. Finally, in Section 7, we conclude and discuss future work.

2. Related Work

Studies on the application of privacy in hospital settings cover different aspects. Various studies were selected to identify privacy targeting, including encryption, profile privacy, device privacy, and taxonomic definitions. The focus among the related papers varies from studies on the security of mobile applications to systems conceived to protect user privacy.

Barket et al. [ 10 ] present a broad study on the context of privacy, developing a taxonomy meant to connect privacy and technology based on the following aspects: purpose, visibility, and granularity. According to the authors, purpose relates to why the information is requested; depending on the cause, more or fewer details about the user are passed on. Visibility refers to who is allowed to access the user data. Granularity designates the level of detail of the data transferred for the type of access and purpose of that particular request.

The work of Asaddok et al. [ 11 ] involves mobile devices in the area of health (Mobile Health (mHealth)) and the parameters usability, security, and privacy. The authors propose a taxonomy that involves the three parameters mentioned, each of which branches into sub-taxonomies: usability into effectiveness, efficiency, satisfaction, and learning; security into confidentiality, integrity, and availability; and privacy into identity, access, and disclosure.

Coen-Porisini et al. [ 12 ] describe a conceptual model for defining privacy policies that covers the user, the user’s profile, the information, and the action a third party takes to request that information. The authors express the links between these elements in Unified Modeling Language (UML) format. Users are divided into personnel—the person the data refers to; processor—the person who requests the data; and controller—the person who controls the actions requested by the processor. Data is divided into identifiable—when it is clear who the data refers to, such as a name—and sensitive—characterized by its information content, processing, and purpose. There is also an interaction between the medical user and the controller, along with the processes of access (processing), treatment (purpose), and communication (obligation). The diagram demonstrates how information is delivered to the medical user through requests, based on their access profile.
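The role split of [ 12 ] (personnel, processor, controller) can be sketched as a small access check. The following is our illustrative reading of the model, not the authors' implementation; all names are hypothetical:

```python
from dataclasses import dataclass
from enum import Enum, auto

class Role(Enum):
    PERSONNEL = auto()   # the person the data refers to
    PROCESSOR = auto()   # the person who requests the data
    CONTROLLER = auto()  # the person who approves the processor's requests

class DataKind(Enum):
    IDENTIFIABLE = auto()  # e.g. a name: directly reveals who it refers to
    SENSITIVE = auto()     # characterized by processing and purpose

@dataclass(frozen=True)
class DataItem:
    label: str
    kind: DataKind

def may_access(requester: Role, item: DataItem, controller_approved: bool) -> bool:
    """Personnel always see their own data; a processor needs the
    controller's approval for any item; the controller mediates requests."""
    if requester is Role.PROCESSOR:
        return controller_approved
    return True

record = DataItem("diagnosis", DataKind.SENSITIVE)
print(may_access(Role.PROCESSOR, record, controller_approved=False))  # False
```

The `DataKind` tag is carried along so a real policy could refine the rule per category of data, as the model's identifiable/sensitive split suggests.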

Silva et al. [ 13 ] use a notification management system focused on user privacy in this context. Their work contributed an application that can handle different types of notifications; moreover, the network allowed those involved to ensure that the messages sent and received followed previously defined rules. Applied to health notifications or to alerts for COVID-19 cases, this is a strategic tool, addressing messages with defined priorities while also enforcing privacy on the traffic sent. This work therefore helps link IoT requirements and definitions. In [ 14 ], the authors implemented a system for monitoring and profiling based on data privacy in IoT. From the test results, they identified different profiles assigned to random situations. In this case, the health system user’s profile priorities would apply and determine which profiles would be authorized to receive data. That work also addressed the promotion and demotion of users within the hierarchy based on factors that capture how frequently users are present in the environments tested.

Concerning the relationship between data privacy and its use in situations such as the COVID-19 crisis, Zwitter and Gstrein [ 15 ] deal with the basic concept of human rights that relates data privacy to the need to use certain information, such as someone’s location. The authors mention features of applications developed in China, South Korea, and the United States that use tracking techniques to indicate close contact with virus carriers or to identify the movements of specific individuals or groups. The study concludes that location data is important in the fight against the spread of the virus, but other relevant information, such as genetic data, should also be considered. This information must be used correctly, as stipulated by law. The study also states that data sensitivity classification is contextual, and that data protection and privacy are important and must be maintained even in a crisis. Information leaks are inevitable, so organizations should always protect themselves; ethics in data manipulation is mandatory for more efficient analysis.

Yesmin and Carter [ 16 ] deal with the privacy of patients’ data in terms of the interoperability of systems and employees’ access to information. They also note that no framework yet exists for evaluating privacy audit tools in hospitals. Applying such a framework would help identify trends in data access and allow a hospital to improve its detection of possible data leaks. According to the authors, the literature reveals that the most significant leakage of information occurs through employees (nurses, doctors, vendors, and others). An evaluation framework was then developed and tested using the black-box concept, which draws on usability testing information. The following must be monitored through machine learning or artificial intelligence tools: employee access to information, validation of entries and non-standard behavior, and unexplained access to files.

The work of Islam et al. [ 17 ] is a survey on the application of IoT devices in the health system. The authors discuss the IoT network topology for health, which facilitates the transmission and reception of medical data and enables data transmission on demand. They also describe features of wearable devices, which capture and store patient data such as blood sugar levels, cardiac activity, body temperature, and oxygen saturation. The authors explain that the security requirements applied to healthcare IoT equipment are similar to those of other communication scenarios. Therefore, the following must be considered: confidentiality, integrity, authentication, availability, data freshness, non-repudiation, authorization, resilience, fault tolerance, and fault recovery.

Sun et al. [ 18 ] designed the HCPP (Healthcare System for Patient Privacy) system to protect privacy and enable patient care in emergency cases. The entities defined for the system are the patient, the doctor, the data server, the family, the personal device, and the authentication server. According to the authors, the system meets the following security criteria: privacy, data preservation by backup, access control, accountability, data integrity, confidentiality, and availability.

Samaila et al. [ 19 ] developed a survey that collects work on security and privacy in IoT in general. The study’s scope ranges from security, encryption, communication protocols, and authentication to privacy, among other topics, and the authors also collected information on applications, reliability, and other technical issues, combining ten related works. Additionally, the survey covers a system model, a threat model, protocols and technologies, and security requirements. The work discusses IoT architecture across nine application domains: home automation, energy, developed urban areas, transport, health, manufacturing, supply chain, wearables, and agriculture. Security measures and system and threat models were defined for each application domain, including protocols and communications. The security properties covered were confidentiality, integrity, availability, authenticity, authorization, non-repudiation, accountability, reliability, privacy, and physical security. The authors also describe mechanisms that can be applied to achieve the desired security requirements: authentication, access control, encryption, secure boot, security updates, backup, physical security of the environment, and device tampering detection.

Plachkinova, Andrés, and Chatterjee [ 20 ] elaborated a taxonomy focused on privacy in mHealth apps. Apps downloadable through app stores have no unified way of providing terms of use or privacy policies to the user. These apps mostly support communication between patients and doctors, access to patient medical records, self-diagnosis based on symptoms, and similar tasks. How user data is managed after an app is installed is often unclear. The authors elaborated a taxonomy that embraces three dimensions: mHealth app (patient care and monitoring; health apps for the layperson; communication, education, and research; physician or student reference apps), mHealth security (authentication; authorization; accountability; integrity; availability; ease of use; confidentiality; management; physical security), and mHealth privacy (identity threats; access threats; disclosure threats).

Alsubaei, Abuhussein, and Shiva [ 21 ] proposed a taxonomy aimed at enhancing security among IoT medical devices, since an insecure device can pose life-threatening risks. According to the authors, because security and privacy are becoming challenging due to the sensitivity of healthcare data, it is crucial to enhance these measures. The taxonomy is based on the following topics: IoT layer, intruders, compromise level, attack impact, attack method, CIA compromise, attack origin, attack level, and attack difficulty. Each topic has subsections that embrace its items. Since new attacks are always being created, this taxonomy can be updated, according to the authors.

The related works we selected cover the topics we cited as critical to privacy. Some used cryptography in their study as a reference for types of attacks, and others used cryptography to prevent data from being accessed by third parties. Most of them applied user profile privacy to prevent any unauthorized access or to mitigate it when it happens.

Data encryption is necessary so that, in the event of an attack, a third party cannot gain access to information [ 22 ]. Cryptography figures directly in [ 17 , 18 ]. Islam et al. [ 17 ] mention cryptography among security threats, where cryptographic keys can be stolen to collect sensitive user data. The work of Sun et al. [ 18 ] mentions encryption as a way to protect health information and applies identity-based cryptography for encryption, authentication, and deriving shared keys for their Healthcare system for Patient Privacy (HCPP) protocols. They also make use of searchable symmetric encryption to return encrypted documents to their owner.
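The idea of keeping records unreadable to third parties can be illustrated with a toy, standard-library-only stream cipher. This is a didactic sketch, not a production scheme: real deployments should use a vetted construction such as AES-GCM, or the identity-based and searchable schemes of [ 18 ]. All function names here are our own:

```python
import hashlib
import os

def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Expand key and nonce into a pseudo-random keystream (SHA-256 in counter mode)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt_record(key: bytes, plaintext: bytes) -> bytes:
    """XOR the record with the keystream; prepend the random nonce."""
    nonce = os.urandom(16)
    stream = _keystream(key, nonce, len(plaintext))
    return nonce + bytes(p ^ s for p, s in zip(plaintext, stream))

def decrypt_record(key: bytes, blob: bytes) -> bytes:
    """Recover the plaintext by regenerating the same keystream."""
    nonce, ciphertext = blob[:16], blob[16:]
    stream = _keystream(key, nonce, len(ciphertext))
    return bytes(c ^ s for c, s in zip(ciphertext, stream))

key = hashlib.sha256(b"patient-secret").digest()
blob = encrypt_record(key, b"BP 120/80, temp 38.2 C")
assert decrypt_record(key, blob) == b"BP 120/80, temp 38.2 C"
```

A fresh nonce per record ensures that encrypting the same reading twice yields different ciphertexts, which is the minimal property any record store should preserve.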

The application of a private profile is mentioned in all the works except [ 21 ]. User profile privacy serves to protect information from being used by third parties [ 23 ]. A security layer should be applied at the device level to prevent third parties from accessing information or even gaining control of the device [ 24 ]. The work of Alsubaei, Abuhussein, and Shiva [ 21 ] mentions attacks that affect the Confidentiality, Integrity, and Availability (CIA) triad, a basic threat to privacy, but does not explore ways to protect user privacy concerning authorization-based data access. Barker et al. [ 10 ] address the private profile through who can access the data and which data can be accessed, based on the purpose of the access request.

Asaddok and Ghazali [ 11 ] defined data access in terms of access to patient identity information, personal health information, and personal health records, captured in their taxonomy as identity, access, and disclosure. Coen-Porisini et al. [ 12 ] state that data access must rely on access control based on users and their roles; thus, data access must be granted based on consent given by the patient. Silva et al. [ 13 ] defined their privacy requirements based on user permissions, environment, and hierarchy. Leithardt et al. [ 14 ] proposed a middleware in which the user’s permissions can change with the environment and how frequently the user visits it. This way, the information provided varies with the environment and the rules of its context.
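Combining the consent-based grants of [ 12 ] with the environment-dependent permissions of [ 14 ] can be sketched as a two-condition check. The consent triples, environment levels, and threshold below are hypothetical values chosen for illustration:

```python
# Consent registry: (patient, requester, field) triples the patient has approved.
consents = {
    ("alice", "dr_bob", "diagnosis"),
    ("alice", "nurse_eve", "temperature"),
}

# Trust level granted by each environment (higher = more privileged context).
environment_level = {"public_area": 0, "screening_room": 1, "office": 2}

def access_granted(patient: str, requester: str, field: str,
                   environment: str, required_level: int = 1) -> bool:
    """Grant access only with explicit patient consent AND from a
    sufficiently trusted environment."""
    consented = (patient, requester, field) in consents
    trusted = environment_level.get(environment, 0) >= required_level
    return consented and trusted

print(access_granted("alice", "dr_bob", "diagnosis", "office"))       # True
print(access_granted("alice", "dr_bob", "diagnosis", "public_area"))  # False
```

Requiring both conditions means a consented requester still cannot pull records from an untrusted location, mirroring how context restricts permissions in [ 14 ].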

Zwitter and Gstrein [ 15 ] state that data collection and use must respect the principle of proportionality and the individual’s interests. Their work is based on data collected on individuals’ location and genetic data. The authors set out the following user data principles: sensitivity, privacy and protection, precaution against breaches, and ethics. The study of Yesmin and Carter [ 16 ] was concerned with patient data in terms of authorized and unauthorized access. The authors developed a framework that audits this access, although the study was limited because real patient information could not be used to validate the tool; they were nevertheless able to evaluate the number of unauthorized or unexplained accesses to patients’ data.

Islam et al. [ 17 ] treated data through the CIA triad, where confidentiality concerns medical information and its protection against unauthorized users. Their study gathered information on various aspects of the use of IoT devices in medical care. They argue that policies and security measures must be introduced for data protection when sharing data with users, organizations, and applications. Sun et al. [ 18 ] combined cryptography with user privacy and the patient’s trust relationships with entities such as family members, physicians, or the patient’s own device; these entities are allowed to access the patient’s protected health information. Plachkinova, Andrés, and Chatterjee [ 20 ] studied mHealth apps and concerns about the use of information, terms of use, and privacy policies. The authors note that it is unclear how the data is managed or who gets access to it. They developed a taxonomy in which user data falls under identity threats, access threats, and disclosure threats.

Concern for device privacy appears in most papers. In Alsubaei, Abuhussein, and Shiva [ 21 ], the IoT device is part of the proposed security taxonomy. As their work concerns medical IoT devices, wearable devices, which embrace numerous sensors, are part of the proposed taxonomy. The authors describe potential attacks on these devices, such as side-channel attacks, tag cloning, device tampering, and sensor tracking. In the work of Asaddok and Ghazali [ 11 ], mobile devices are classified under the application dimension of the taxonomy, in the topic ’patient care and monitoring’, as they are used for patient observation.

The work of Silva et al. [ 13 ] applies privacy to mobile devices with respect to aspects such as the environment; privacy on mobile devices is thus part of their taxonomy and a base point of their study. Leithardt et al. [ 14 ] center their work on device privacy. Zwitter and Gstrein [ 15 ] mention mobile devices, although their concern focuses on apps and location data, not the device itself. Islam et al. [ 17 ] treat devices such as mobile phones as connected to the Internet through IoT providers; they are therefore vulnerable to security attacks, which may originate within or outside the network. The authors place IoT health devices within an attack taxonomy covering information, host, and network. Sun et al. [ 18 ] define the Private Device (P-device) as an entity in the HCPP system, such as a smartphone or wearable device; the patient uses the P-device to manage access privileges to their health data. In Plachkinova, Andrés, and Chatterjee [ 20 ], the device must be secured, as it can leak data about the patient’s location or sensors. Since the apps mentioned in their work fail to provide accurate data management information, the device can become a tool for misusing information.

The use of data acquired from different sensors requires the implementation of several privacy and security rules. In [ 25 ], a low-cost system is presented that embeds the measurement of temperature, heart rate, respiration rate, and other parameters to define a person’s health state. The system communicates with the healthcare professional to prevent several adverse situations. In addition to these sensor data, it tracks the user’s location to help detect possible contagion, and it may be used for a preliminary diagnosis. Mobile devices are capable of acquiring different types of data under several conditions. Spain was among the countries hit hardest by the pandemic, and the authors of [ 26 ] proposed the implementation of online sensing networks to support social quarantine and reduce contagion with the virus.

Monitoring COVID-19 requires secure technologies, and the IEEE 802.11ah technology was used in [ 27 ] to support the prevention of contamination with COVID-19. It can be implemented in telemonitoring technologies to provide reliable information and prevent contact. The network must know in advance which persons are contaminated with the virus. The tracking of location and movements may be performed with location, inertial, and proximity sensors that communicate the data to social networks to reduce social contact with infected individuals. The authors of [ 28 ] studied different privacy constraints related to real-time monitoring with mobile devices. Monitoring with mobile devices can be considered a digital vaccine that helps reduce contagion through massive sharing of the data.
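The location-based part of such monitoring reduces to flagging when two position fixes are close. A minimal sketch using the haversine formula follows; the 2 m threshold is an arbitrary example, and real contact tracing typically relies on proximity sensors (e.g. Bluetooth) rather than raw GPS, whose accuracy is far coarser:

```python
import math

def haversine_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance in metres between two GPS fixes."""
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def close_contact(fix_a: tuple, fix_b: tuple, threshold_m: float = 2.0) -> bool:
    """Flag two (lat, lon) fixes as a potential close contact."""
    return haversine_m(*fix_a, *fix_b) <= threshold_m

print(close_contact((40.4168, -3.7038), (40.4168, -3.7038)))  # True
```

Privacy-wise, only the boolean contact flag, not the raw trajectory, would need to leave the device, which is the point made by [ 28 ] about constraining real-time monitoring.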

The creation of a taxonomy was proposed by most of the related works. Alsubaei, Abuhussein, and Shiva [ 21 ] proposed a taxonomy covering IoT layer, intruder type, compromise level, impact, attack method, CIA compromise, attack origin, attack level, and attack difficulty; it embraces the security and privacy aspects of medical IoT devices. Barker et al. [ 10 ] explored three dimensions to develop a taxonomy: visibility, granularity, and purpose. These dimensions focus on privacy aspects: visibility deals with who is permitted to access the data, granularity with the characteristics of the data so that it is directed to the appropriate use, and purpose with the reason the data is requested. Asaddok and Ghazali [ 11 ] developed a taxonomy covering usability, security, and privacy aspects of mHealth applications, with each item derived into three or more sub-items. Silva et al. [ 13 ] developed a taxonomy for notifications on mobile devices, including communication protocols, message transmission technologies, privacy, and criteria. Plachkinova et al. [ 20 ] proposed a taxonomy for mHealth apps regarding security and privacy, whose items involve the app, security, and privacy dimensions.

Table 1 presents a comparison of the related works concerning the application of the privacy aspects described above, together with whether a taxonomy is defined.

Table 1. Scope of related works. Each related work (published between 2007 and 2020) and our proposal are compared across four columns: Cryptography, Private Profile, Devices, and Taxonomy. [The per-row citation numbers and coverage marks are not recoverable from the source.]

Even though cryptography is one of the main concerns when dealing with data privacy, explicit descriptions of how to apply it were found only in the works of Islam et al. [ 17 ], Sun et al. [ 18 ], and Riza and Gunawan [ 27 ]. As Horst Feistel [ 29 ] said almost 50 years ago: “personal data needs protection, which can be achieved when enciphering the material”. Cryptography prevents the plaintext from being accessible to people who are not authorized to have it, making it an important tool when dealing with personal data. The work of Islam et al. [ 17 ] comprises a survey of IoT in health care, including analysis of security and privacy aspects. However, the authors did not describe how cryptography can be applied; instead, they mentioned that some parts of the flow can be tampered with by attackers to obtain cryptographic secrets. IoT systems should therefore be designed with protections against the theft of cryptographic keys.

The work of Sun et al. [ 18 ] is focused on cryptography, as it describes a system built on this aspect. The authors designed protocols for a healthcare system whose security leverages cryptographic tools. HCPP allows patients to store their medical records even on public servers, where only the patient can retrieve the information. The patient’s medical record is encrypted to ensure privacy, and its content can be retrieved only by the patient, or by the physician while treatment is being carried out. If the patient is for any reason unable to retrieve the medical record, the system can provide the relevant information to the physician without compromising the patient’s secret key. In our work, cryptography is used to prevent unauthorized access to the patient’s medical records. As can be seen in our proposed taxonomy in Figure 1 , cryptography is one of the User items, as it is a critical tool for protecting patient data. Patients’ medical records should be stored and transmitted in encrypted form, so that only personnel who hold the required authorization, and therefore the secret keys, can decrypt the data.

Figure 1. Proposed taxonomy.

In comparison to the selected works, ours stands out because it includes the indication of encryption, profile privacy, device-level concerns, and a taxonomic definition meant to delimit the theme and scenario of the application more clearly. Our taxonomic definition aims to embrace the aspects that must be covered to enhance security measures for the patient’s sensitive data. We developed a mobile application to validate the flow of information from the moment patients are admitted to the hospital until they are discharged.

The use of a mobile application that implements data privacy parameters for the data of patients infected with COVID-19 is another contribution of this study. Patient data may include location, temperature, and browsing history, among others. We consider that contagion can be identified in the first moments spent in the emergency room, using basic information on health status and monitoring of the feverish state with IoT devices. The degree of privacy applied in each user’s registration process should enable identifying infected patients without exposing sensitive data.

To this end, we have developed a taxonomy that highlights how important it is for confidential information to be handled with care. We have included examples of privacy applications in the use of IoT devices to receive, screen, and provide patient care, with a focus on the COVID-19 pandemic.

3. Taxonomy

We have developed a taxonomic definition for a better classification of the items related to the privacy parameters. A taxonomy is necessary to identify the critical aspects where security measures and policies need to be applied. Based on the goals of this paper and the comparisons with the related works, we selected the principal parameters for managing privacy, which are subdivided into further levels to better embrace the desired security aspects.

As presented in [ 13 , 30 , 31 ], a taxonomy allows the systematic organization of relevant data in the form of a hierarchy. The keywords and concepts used to define a taxonomy establish parameters throughout the information production cycle, in which distributed professionals can participate in the knowledge creation process in an organized way. Our definition covers four parameters for managing privacy standards in hospital settings within the previously defined context, each with five attributes considered necessary for this scenario. Figure 1 shows the taxonomic definitions proposed in this paper.
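The four parameters and their attributes (detailed in Sections 3.1–3.4) can be represented as a simple two-level hierarchy; the plain-dictionary encoding is our illustrative choice, not part of the taxonomy itself:

```python
# Four parameters, each with the five attributes defined in Sections 3.1-3.4.
taxonomy = {
    "user": ["profile", "collaborative", "hierarchy", "cryptography", "data"],
    "environment": ["topology", "interoperability", "policies", "risks", "hierarchy"],
    "privacy": ["communication", "applicability", "controller", "consent", "operator"],
    "device": ["function", "location", "communication", "accessibility", "interactivity"],
}

# Sanity check: every parameter carries exactly five attributes.
assert all(len(attrs) == 5 for attrs in taxonomy.values())
```

Encoding the hierarchy as data makes it straightforward for an application to validate that every piece of stored metadata maps onto a declared attribute.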

3.1. User Parameter

The user parameter designates the person who provides, controls, or operates the sensitive data used in privacy handling. It refers not only to the patient but also to the participants in the provision or control of the data. For this parameter, we set the following attributes: profile, collaborative, hierarchy, cryptography, and data. The profile attribute covers the several actors that take part in the process. According to Fengou et al. [ 32 ], six entities participate in interactions taking place in the hospital environment:

  • the patient himself/herself;
  • the clinical network that will care for the patient, including doctors, family members, volunteers, and the health insurance provider, among others;
  • the hospital;
  • the smart home, as an environment with ubiquitous equipment capable of providing security and quality of life;
  • the environment in which the patient works;
  • the vehicle in which the patient is transferred to the clinical center.

Based on the entities listed, it can be observed that the user profile must be substantiated, along with the profiles of the other entities. The patient’s cooperativeness in providing their registration data is fundamental for a better experience in the given setting. According to Leithardt [ 33 ], the user must provide access to their information and services, thus favoring both their experience in using the service and the improvement of the system as a whole. The hierarchy enables proper separation of the levels and permissions of each user type. Viswanatham and Senthilkumar [ 34 ] proposed so-called hierarchy-based user privacy, where information is encrypted and decrypted based on access levels and releases.
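The idea of hierarchy-based privacy in [ 34 ] can be illustrated with a hash-chain key derivation, where holding a higher-level key lets one derive every lower-level key but not the reverse. This is our simplified sketch, not the authors' construction:

```python
import hashlib

def level_key(master_key: bytes, level: int) -> bytes:
    """Key for an access level, derived by repeated hashing.
    Holding the key for level n lets a user derive the keys of every
    deeper (less privileged) level n+1, n+2, ..., but never a shallower
    one, because SHA-256 cannot be inverted."""
    key = master_key
    for _ in range(level):
        key = hashlib.sha256(key).digest()
    return key

master = hashlib.sha256(b"hospital-master-secret").digest()
doctor_key = level_key(master, 1)   # level 1: full clinical record
visitor_key = level_key(master, 3)  # level 3: ward and visiting hours only

# A doctor can derive the visitor key (two more hash steps), not vice versa.
assert level_key(doctor_key, 2) == visitor_key
```

Data encrypted under a given level key is then readable by that level and every level above it, matching the "encrypted and decrypted based on access levels" idea.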

The General Data Protection Regulation (GDPR) deals with the need to protect confidential data and the ever-present risk of data theft. The cryptography attribute reinforces that all sensitive information must be covered by an acceptable security level, both at its source and at its destination. Ibraimi et al. [ 35 ] note that patient confidentiality is one of the significant obstacles to obtaining medical data, as some information is not shared for fear of it being saved in databases that do not comply with security regulations. The protection of sensitive patient information is an essential task. The HIPAA Privacy Standard, issued by the Department of Health and Human Services in 2002 [ 36 ], deals with the security of sensitive patient information in the medical field; HIPAA itself is a US federal law created in 1996 to impose standards for protecting such information and preventing it from being shared without the patient’s consent. Cooper et al. [ 37 ] deal with privacy and security in data mining in the medical field and cite HIPAA on information privacy matters, suggesting in 2002 that protective measures be imposed on health plans, clinical centers, and other entities involved.

3.2. Environment Parameter

The environment parameter represents the smart physical location where user data flows between different systems and devices. For this parameter, we define the following attributes: topology, interoperability, policies, risks, and hierarchy. Topology refers to the architecture of a hospital environment. Costa [ 38 ] notes that hospitals used to be built with an emphasis on the utility of the building and the construction technique used. The processes and dynamics of the health field are often determined by how the wards, sectors, and departments that house distinct functions are arranged. Many of the procedures that occur during the patient’s journey through the emergency room rely on one or more systems.

Interoperability between systems is now strongly present in the medical field. According to Lopes [ 39 ], strategies used to be designed and developed from an internal perspective of organizations, with no motivation for integration with other systems. In all the smart environments through which people transit, data is shared between information systems and IoT devices. Data is a vital part of the operation of a health institution. Several policies need to be established to secure access to these environments and to define what data will be exchanged between systems and devices. According to Yildirim et al. [ 40 ], information security management is an activity that aims to implement a set of policies that help define an acceptable level of security in these environments, minimizing the potential risks inherent in the exploitation of this information.

Risk management in hospital settings is a crucial activity for the proper functioning of the operation. According to Florence et al. [ 41 ], risk is an estimated value that considers the probability of occurrence of damage and the severity of that damage. Therefore, the factors behind those risks need to be mapped, controlled, and addressed by procedures meant to minimize them. Patient care is large in scope and complex: it occurs at various times and in multiple environments over the course of service, with several interactions between the patient, other participants, and technologies. Soares et al. [ 9 ] emphasize that, due to its characteristics and complexity, the hospital environment favors establishing asymmetrical power relationships between the nursing team and patients, an asymmetry resulting from the patients’ fragility and vulnerability in the face of health-disease processes.

3.3. Privacy Parameter

The privacy parameter designates how each piece of information will be handled according to its characteristics. For this parameter, we define the following attributes: communication, applicability, controller, consent, and operator. Communication is linked to the type of user profile and will usually involve transmitting information over insecure channels. According to Machado [ 42 ], anonymization and encryption in particular are tied to the means of communication, i.e., the very existence of communication drives the need to apply security measures to data. It is a basic human right to have one’s sensitive data handled with care; thus, the applicability attribute is significant. The General Data Protection Law (LGPD) [ 43 ], the Brazilian data protection law, aims to apply standards and laws regulating and protecting individuals’ data. Without such standards and regulations, sensitive information could easily be used by those who should not have access to it in the first place.

The controller attribute determines who has the authority to decide the type of treatment to which personal data will be submitted. As stated in the LGPD [ 43 ], the controller must obtain the consent of the individual owner or holder of the data concerned. The user may, in turn, deny or grant a third party access to their information. The user must give their consent, a manifestation by which they agree that their information be used in a specific way for a particular purpose; if the controller wishes to use the data at another time, consent must be requested again [ 43 ]. The operator is responsible for carrying out the data processing determined by the controller and, as stated in the LGPD [ 43 ], is jointly and severally liable for damages caused by data handling if the processing does not comply with legal provisions or with the controller’s instructions. The user provides consent, and the operator is responsible for processing the information made available, whether for personal use or for transfer to third parties.
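The controller/consent/operator flow described above can be sketched as a minimal consent registry that records, per data subject, which processing purposes were approved; the class and all names are hypothetical:

```python
class ConsentRegistry:
    """Records, per data subject, which processing purposes were consented to."""

    def __init__(self):
        self._grants = set()  # set of (subject, purpose) pairs

    def grant(self, subject: str, purpose: str) -> None:
        """The controller records the subject's consent for one purpose."""
        self._grants.add((subject, purpose))

    def revoke(self, subject: str, purpose: str) -> None:
        """The subject may withdraw consent at any time."""
        self._grants.discard((subject, purpose))

    def allowed(self, subject: str, purpose: str) -> bool:
        """The operator checks consent before any processing."""
        return (subject, purpose) in self._grants

registry = ConsentRegistry()
registry.grant("alice", "triage")
print(registry.allowed("alice", "triage"))    # True
print(registry.allowed("alice", "research"))  # False: a new purpose needs fresh consent
```

Keying consent on the (subject, purpose) pair captures the LGPD rule that reuse for a different purpose requires a new manifestation of consent.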

3.4. Device

The device parameter represents either the IoT equipment present in the smart environment that will interact with the patient’s data or the wearable IoT device set to monitor the patient’s temperature. Devices may be fixed in the environment, such as surveillance cameras, or used to monitor the patient, in which case they can be fixed or mobile. For this parameter, we define the following attributes: function, location, communication, accessibility, and interactivity. The device must meet the needs of the process to which it is assigned. According to Lupiana, O’Driscoll, and Mtenzi [ 44 ], one of the relevant requirements for devices is their storage and processing capacity. The location attribute refers to where the device is installed. For Leithardt et al. [ 14 ], the attribute that controls location must be linked to a database containing all user data. This database is accessible only for updating and validating certain data; other information should be processed from the point where the user accessed the system, to provide greater security and reliability. Figure 2 shows both fixed and wearable IoT devices and how the parameters are applied: all five attributes are used for both, and the last column in Figure 2 shows some of the possible options for each attribute.

Figure 2. Devices with their parameters.

The way the device communicates with the user is addressed through the communication attribute and relates to heterogeneity, a feature that ensures information is handled evenly. According to the study by Pradilla, Esteve, and Palau [ 45 ], devices are responsible for data acquisition through sensors, supporting data treatment with processing units, and acting in conjunction with IoT. It is therefore necessary to handle heterogeneity in the communication protocols used by the device and in the number and types of services available. This attribute is associated with the device’s protocols, providing security in data transfer. The accessibility attribute covers the crucial possibility of reaching the device whenever necessary, and interactivity between the device and the client must be ensured. With this in mind, we have developed a model based on the characteristics and functionalities defined in the described taxonomy.
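The five device attributes can be captured in a small data structure; the example values for the fixed and wearable devices are hypothetical illustrations of the options shown in Figure 2:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Device:
    function: str       # role of the device in the care process
    location: str       # where it is installed or worn
    communication: str  # protocol used to exchange data
    accessibility: str  # who or what may reach the device
    interactivity: str  # how users interact with it

# A fixed device and a wearable device share the same five attributes.
camera = Device("surveillance", "emergency room", "Ethernet",
                "security staff only", "none")
wearable = Device("temperature monitoring", "patient wrist", "Bluetooth LE",
                  "clinical network", "continuous readings")
```

Making the dataclass frozen reflects that a device's declared attributes should be registered once and changed only through a controlled update, in line with the validation role of the location database.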

4. Project Modeling

The model consists of use case diagrams, sequence diagrams, and context diagrams, all based on UML notation. The model covers the process from the patient’s arrival at the hospital until discharge.

4.1. Use Case Diagrams

The first use case represents the entry of a patient into the emergency room. The patient interacts with the receptionist and performs some procedures. This use case, shown in Figure 3 , includes some of the attributes of the proposed taxonomic definition: privacy, represented by the data to which the patient grants access and that is registered in the systems; user, represented by the patient and the receptionist; and environment, represented by the emergency room.

Figure 3. Reception at the Emergency Room.

After first care and registration, the transfer of the patient to the screening area is demonstrated in the use case pictured in Figure 4 . The screening process aims to establish the urgency of the case and the risk classification. This use case includes some of the attributes present in the proposed taxonomic definition: privacy, represented by the data to which the patient grants access and that is registered in the systems and the wearable IoT device; user, represented by the patient and the nurse; environment, represented by the screening room; and device, represented by the wearable IoT device that will receive an identification to record the data and the classification of this patient.

Figure 4. Screening Room.

The last use case represents the patient being attended by the doctor in the office after going through the screening process. The wearable IoT device identifies the patient so that the data is made available, and the doctor proceeds with the consultation. The doctor performs the anamnesis and records the data in the Electronic Health Record (EHR). This use case involves the following attributes of our taxonomy: privacy, represented by the data to which the patient grants access and that is registered in the systems; user, represented by the patient and the doctor; environment, represented by the office; and device, represented by the wearable IoT device used by the patient. This case is illustrated in Figure 5 .

Figure 5. Reception at the Office.

Sequence diagrams of each use case were also developed. A sequence diagram is a UML tool used to represent interactions between objects in a scenario, carried out through operations or methods.

4.2. Sequence Diagrams

The sequence diagram displayed in Figure 6 represents the entry of a patient into the emergency room. It shows the arrival of the patient (user) at the emergency room (environment), where they request assistance from the receptionist (user). The receptionist gives the patient a queue ticket while they wait to be called. When called, the patient provides data for registration updates (privacy), which the receptionist records in the hospital system. The receptionist checks whether the patient has a health plan and records how the billing for this service will be managed. After this procedure, the patient is referred to screening.

Figure 6. Sequence—Reception at the Emergency Room.

The sequence diagram displayed in Figure 7 represents the patient’s entry into the screening room after completing the first stage in the emergency room. It shows the arrival of the patient (user) at the screening room (environment), where they convey their data as requested by the nurse (user). The nurse records the data in the hospital system and makes the entry that configures the wearable IoT device that will monitor the patient (device). The nurse then hands the wearable IoT device over to the patient and starts the assessment. The patient answers the questions (privacy), and the nurse records all the information in the hospital system. All patient data is then in the system, and the hospital can track the patient through their wearable IoT device.

Figure 7. Sequence—Screening Room.

The sequence diagram displayed in Figure 8 shows the patient’s entry into the office after going through the screening process. It represents the arrival of the patient (user) at the office (environment), where they convey their identification data as requested by the doctor (user). The doctor records the data in the electronic record and links it to the patient’s wearable IoT device in the hospital system. The doctor performs the anamnesis on the patient, who must answer the questions (privacy), and records this information in the patient’s electronic record. Once attended, the patient is medicated and released or referred to another hospital ward, depending on the evolution of their clinical condition.

Figure 8. Sequence—Office.

4.3. Context Diagrams

The Context Diagram is a UML tool that represents the entire system as a single process. It consists of data streams that show the interfaces between the system and external entities [ 46 ]. The diagram illustrates the object of the study, the project, and its relationship to the environment. Figure 9 represents the context diagram of this project.

Figure 9. Context Diagram.

The patient (user) requests assistance from the receptionist (user), who fills in the data (privacy) in the hospital system. The hospital system interacts with the health plan operator’s system, which is outside the physical environment of the hospital. In the screening process, the nurse (user) conducts the questionnaire with the patient (user), entering the basic health data in the central hospital system, which interacts with the wearable IoT devices’ system. Finally, the doctor (user) performs the anamnesis, entering the consultation information in the central hospital system. These registration processes follow the privacy definitions, and all of them occur in a clinical setting, explicitly represented within the context diagram.

5. Prototype

A mobile application was developed as a prototype to illustrate the basic principles, from the admission of the patient to the emergency room through referral to the office or indication of discharge. The goal of the application is to validate some taxonomy items, as it embraces the environment parameter (interoperability between the system and the wearable IoT device). The application was developed using Node.js.

The application comprises an initial patient registration screen, which simulates filling out the registration form upon admission to the emergency room. The prototype contains only the primary fields: name, gender, age, and address. An ‘encrypt data?’ checkbox was included to select the encryption/hash algorithm. Since this is merely a prototype for demonstrating the flow of information and its security mechanisms, the SHA-256 and SHA-512 hashes were made available; in a real application they would not serve to encrypt data, because hashes are not reversible and are considered one-way functions [ 47 ]. The prototype also includes the Advanced Encryption Standard (AES) symmetric encryption algorithm. Figure 10 illustrates the first registration screen of the application with the fields mentioned above.

Figure 10. Application Flow.

As shown in Figure 10 , the application’s flow is as follows: initially, the patient fills out a form with personal data. The data is encrypted and sent to the systems through the hospital network as necessary. The patient is then sent to the screening room to answer more questions and thus help medical personnel assess their situation, and receives a wearable device to monitor their health status. All the information collected about the patient and their health status is included in their digital record. If communication with other systems is required, the information to be sent is encrypted.

The link between this device and the patient’s file allows the information to be collected without a health professional’s intervention. Based on the information provided by the wearable, the system analyzes the temperature. If the patient remains in a feverish state, they are referred to the doctor’s office. Since fever is one of the prevailing symptoms in detecting COVID-19, its absence can prompt a discharge; however, the lack of fever is no guarantee that there is no infection with the virus [ 48 ], so careful monitoring is needed. In addition to the factors described, comparative tests were performed to validate the application based on the requirements initially defined in the taxonomy.

Pseudo-Code

The algorithms applied in the development of the prototype application are described below in pseudo-code format. Pseudo-code covers the generation of a service number, temperature monitoring, and referral in case of emergency.

Algorithm 1 deals with the generation of the service number: the patient’s data is saved in encrypted form and forwarded to the monitoring room, where the service number to be linked to the patient is generated.

POST new medical care
1: generate the service number (attendance number);
2: save the encrypted packet data;
3: send to the monitoring room

Algorithm 2 deals with the temperature monitoring process. The wearable IoT device collects the patient’s temperature at the interval defined by the medical team and sends it to the server for the monitoring process. If the temperature is higher than 38.5 °C, the patient is referred to the ICU. If it is equal to or above 37 °C for five minutes, the patient is referred to another ward for medical assistance. Finally, if it is less than 37 °C for ten minutes, the patient can be released.

Monitoring
1: read the patient’s temperature from the wearable IoT device;
2: if temperature > 38.5 °C: refer the patient to the ICU;
3: else if temperature ≥ 37 °C for five minutes: refer the patient to a ward for medical assistance;
4: else if temperature < 37 °C for ten minutes: release the patient
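The decision rules of Algorithm 2 can be sketched as follows; the function name, return labels, and the `minutesInState` parameter (how long the temperature has stayed in its current range) are illustrative assumptions:

```javascript
// Sketch of Algorithm 2's decision rules, following the thresholds in the
// text: > 38.5 °C -> ICU; >= 37 °C sustained for five minutes -> ward;
// < 37 °C sustained for ten minutes -> discharge.
function classify(temperatureC, minutesInState) {
  if (temperatureC > 38.5) return 'ICU';                        // emergency
  if (temperatureC >= 37 && minutesInState >= 5) return 'ward'; // medical assistance
  if (temperatureC < 37 && minutesInState >= 10) return 'discharge';
  return 'keep monitoring';
}
```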

Algorithm 3 deals with the alert generated for the ICU in cases where the patient is classified as an emergency. If there is no emergency, the alert is generated for the doctor, informing them that the patient will be referred for care.

Alert
1: if the patient is classified as an emergency: send an alert to the ICU;
2: else: send an alert to the doctor informing that the patient will be referred for care
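A minimal sketch of Algorithm 3, assuming a simple callback-style notifier; the function name and message texts are illustrative:

```javascript
// Sketch of Algorithm 3: route the alert to the ICU for emergencies,
// otherwise notify the doctor that the patient is being referred.
function sendAlert(patient, notify) {
  if (patient.emergency) {
    notify('ICU', `Emergency: patient ${patient.serviceNumber} incoming`);
  } else {
    notify('doctor', `Patient ${patient.serviceNumber} will be referred for care`);
  }
}
```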

6. Tests and Results

The controlled flow of information in the application starts after the registration data has been filled in; it is also possible to apply other requirements, such as encryption, to the patient’s data. Figure 11 illustrates the integration of basic patient information and reports that the patient was sent for temperature control. The temperature is captured and sent to the system, which classifies the feverish state, suggesting a different referral for each scenario. If the patient exhibits a feverish state and other symptoms that may characterize COVID-19, their care must be provided in a differentiated way.

Figure 11. Saved Information.

When a type of encryption is chosen in the data registration process, the data security level increases, and the information should be made available only to those who have permission. For prototype demonstration purposes, we use the AES symmetric key encryption method. The encryption aims to secure the data while it is transferred to other devices. Figure 12 shows the encrypted patient registration data.

Figure 12. AES Encryption.

After the patient has been registered and the information stored safely, the data is sent to a system that continually receives body temperature updates. With this prototype, we also tested the hypothesis that an IoT device can monitor the patient for changes in temperature. To test the idea, we implemented a set of random values read by the program to simulate the monitoring process. Every minute, the device checks the temperature of the patients who have entered the system and are waiting at the emergency room’s reception; if their temperature can be characterized as feverish, they are taken to the office with priority. Figure 13 describes the monitoring of a patient whose temperature remains stable, leading to a suggestion of hospital discharge.

Figure 13. Patient record simulating discharge.

If the patient’s state remains feverish for five minutes, a message is sent to the doctor in charge, as shown in Figure 14 . If the temperature remains stable for ten minutes, the patient is released.

Figure 14. Patient registry simulating medical care admittance.

After testing and validating the application, it was possible to observe how the information flows through different devices. In the simulation environment, we experimented with only one system communicating with one wearable device; in real applications, there could be multiple devices interacting with multiple systems. However, the flow of information would be similar: the patient is registered at admission to the emergency room; the system is accessed by the screening sector to insert health status data and receives information from the monitoring devices; and at consultation time, the system receives further details regarding anamnesis, referrals for exams, or hospital discharge.

Given that the feverish state is strongly associated with a COVID-19 diagnosis, the patient should be monitored continuously and receive adequate care as long as the symptoms persist. The high contagiousness of the virus makes such care essential. The monitoring interval parameters, indicative of medical discharge or of a possible disease carrier, are defined according to medical protocols; we emphasize that the interval and discharge suggestion present in this work are meant only to simulate these features.

7. Conclusions

The COVID-19 scenario requires particular solutions for securing the emergency care process and the data generated in all environments. In this sense, this work proposed a taxonomy designed to support the development of privacy mechanisms for health environments.

The taxonomy is branched into four items containing five attributes each, all of which are justified. For the information flow tests, we developed a prototype application that, despite being simple, addresses the main questions about data privacy. The application was developed with registration data inputs and different encryption/hash options to be applied according to environmental criteria. It communicates with a wearable that monitors the patient’s temperature and guides treatment in line with the patient’s feverish state, indicating referral to the doctor’s office or the possibility of discharge. With the application of the taxonomic definitions and the agility of medical professionals in the care of patients with suspected COVID-19, the registration data is kept confidential through encryption and privacy requirements. Temperature should be monitored continuously; in the case of feverish states that persist for a period defined by the institution, together with other symptoms suggestive of the disease, the system suggests the patient’s referral without exposing personal data.

The main contribution of this research is the analysis of different privacy parameters with a mobile application that considers the different rules proposed in our taxonomy; to our knowledge, no previous analysis has examined these privacy constraints with a mobile application. Mobile technologies are widely used, and they may help in the prevention of COVID-19. In addition, more research should be performed, and the taxonomy developed here may be improved and adapted to the real world.

We believe that the research we have carried out contributes to several other studies currently in progress in several countries, which propose monitoring without consent and put forward definitions of use and data privacy criteria. For future work, we are developing improvements to the privacy requirements so that they can be adapted to different countries, thus expanding the variable monitoring features to identify patients with COVID-19 and obtain new tests and results.

Acknowledgments

This work was partially supported by CAPES—Financial Code 001. The authors also acknowledge the collaboration of Computer Laboratory 7 of Instituto de Telecomunicações—IT Branch Covilhã, Portugal. Furthermore, we would like to thank the Politécnico de Viseu for their support.

Abbreviations

The following abbreviations are used in this manuscript:

AES        Advanced Encryption Standard
CIA        Confidentiality, Integrity and Availability
CPF        Cadastro de Pessoa Física
COVID-19   Coronavirus Disease 2019
SARS-CoV-2 Severe Acute Respiratory Syndrome Coronavirus 2
EHR        Electronic Health Record
GDPR       General Data Protection Regulation
HIPAA      Health Insurance Portability and Accountability Act
HCPP       Healthcare system for Patient Privacy
ICU        Intensive Care Unit
IoT        Internet of Things
LGPD       General Data Protection Law (Lei Geral de Proteção de Dados)
mHealth    Mobile Health
P-device   Private Device
UML        Unified Modeling Language

Author Contributions

Conceptualization, A.V.L., L.A.S. and L.G.; Investigation, A.V.L., L.A.S. and V.R.Q.L.; Methodology, L.A.S. and A.V.L.; Project Administration, V.R.Q.L.; Resources, I.M.P.; Supervision, V.R.Q.L.; Validation, V.R.Q.L. and L.A.S.; Writing—original draft, A.V.L., R.L. and L.G.; Writing—review and editing, V.R.Q.L., L.A.S., X.M., J.L.V.B. and I.M.P.; Financial, R.G.O. and V.R.Q.L. All authors have read and agreed to the published version of the manuscript.

Funding

The project Smart following systems, Edge Computing and IoT Consortium, CONSORCIO TC_TCUE18-20_004, CONVOCATORIA CONSORCIOTC. PLAN TCUE 2018-2020. Project managed by Fundación General de la Universidad de Salamanca and co-financed with Junta Castilla y León and FEDER funds. This work was partially supported by Fundação para a Ciência e a Tecnologia under Project UIDB/04111/2020. This work was partially funded by FCT/MEC through national funds and co-funded by FEDER–PT2020 partnership agreement under the project UIDB/50008/2020. This work was partially funded by National Funds through the FCT—Foundation for Science and Technology, I.P., within the scope of the project UIDB/00742/2020.

Conflicts of Interest

The authors declare no conflict of interest.

Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.



References
  • Marques, G.; Saini, J.; Pires, I.M.; Miranda, N.; Pitarma, R. Internet of Things for Enhanced Living Environments, Health and Well-Being: Technologies, Architectures and Systems. In Handbook of Wireless Sensor Networks: Issues and Challenges in Current Scenario’s ; Singh, P.K., Bhargava, B.K., Paprzycki, M., Kaushal, N.C., Hong, W.C., Eds.; Springer International Publishing: Cham, Switzerland, 2020; pp. 616–631. [ Google Scholar ] [ CrossRef ]
  • Pires, I.M.; Marques, G.; Garcia, N.M.; Flórez-Revuelta, F.; Ponciano, V.; Oniani, S. A Research on the Classification and Applicability of the Mobile Health Applications. J. Pers. Med. 2020 , 10 , 11. [ Google Scholar ] [ CrossRef ] [ PubMed ] [ Green Version ]
  • Doukas, C.; Maglogiannis, I. Bringing IoT and Cloud Computing towards Pervasive Healthcare. In Proceedings of the 2012 Sixth International Conference on Innovative Mobile and Internet Services in Ubiquitous Computing, Palermo, Italy, 4–6 July 2012; pp. 922–926. [ Google Scholar ] [ CrossRef ]
  • Al-Odat, Z.A.; Srinivasan, S.K.; Al-qtiemat, E.; Dubasi, M.A.L.; Shuja, S. IoT-Based Secure Embedded Scheme for Insulin Pump Data Acquisition and Monitoring. arXiv 2018 , arXiv:1812.02357. [ Google Scholar ]
  • Farahani, B.; Firouzi, F.; Chang, V.; Badaroglu, M.; Constant, N.; Mankodiya, K. Towards fog-driven IoT eHealth: Promises and challenges of IoT in medicine and healthcare. Future Gener. Comput. Syst. 2017 . [ Google Scholar ] [ CrossRef ] [ Green Version ]
  • Campos, J.; Souza, V.S.A. Percepção dos Usuários do Serviço de Urgência e Emergência em relação à classificação de risco pelo protocolo de Manchester. Rev. Unim. Cient. Montes Claros 2014 , 16 . Available online: http://www.ruc.unimontes.br/index.php/unicientifica/article/view/319/297 (accessed on 8 April 2020).
  • Rothan, H.; Siddappa, N. The epidemiology and pathogenesis of coronavirus disease (COVID-19) outbreak. J. Autoimmun. 2020 , 109 , 102433. [ Google Scholar ] [ CrossRef ]
  • Liang, Y.; Liang, J.; Zhou, Q.; Li, X.; Lin, F.; Deng, Z.; Zhang, B.; Li, L.; Wang, X.; Zhu, H.; et al. Prevalence and clinical features of 2019 novel coronavirus disease (COVID-19) in the Fever Clinic of a teaching hospital in Beijing: A single-center, retrospective study. medRxiv 2020 . [ Google Scholar ] [ CrossRef ] [ Green Version ]
  • Soares, N.V.; Dall’Agnol, C.M. Privacidade dos pacientes: Uma questão para a geração do cuidado em enfermagem. Acta Paul. Enferm. 2011 , 24 , 683–688. [ Google Scholar ] [ CrossRef ] [ Green Version ]
  • Barker, K.; Askari, M.; Banerjee, M.; Ghazinour, K.; Mackas, B.; Majedi, M.; Pun, S.; Williams, A. A Data Privacy Taxonomy. In British National Conference on Databases ; Springer: Berlin/Heidelberg, Germany, 2009; Volume 5588, pp. 42–54. [ Google Scholar ] [ CrossRef ]
  • Asaddok, N.; Ghazali, M. Exploring the usability, security and privacy taxonomy for mobile health applications. In Proceedings of the 2017 International Conference on Research and Innovation in Information Systems (ICRIIS), Langkawi, Malaysia, 16–17 July 2017; pp. 1–6. [ Google Scholar ] [ CrossRef ]
  • Coen-Porisini, A.; Colombo, P.; Sicari, S.; Trombetta, A. A Conceptual Model for Privacy Policies. In Proceedings of the 11th IASTED International Conference on Software Engineering and Applications (SEA ’07), Cambridge, MA, USA, 19–21 November 2007; ACTA Press: Anaheim, CA, USA, 2007; pp. 570–577. [ Google Scholar ] [ CrossRef ]
  • Silva, L.A.; Leithardt, V.R.Q.; Rolim, C.O.; González, G.V.; Geyer, C.F.R.; Silva, J.S. PRISER: Managing Notification in Multiples Devices with Data Privacy Support. Sensors 2019 , 19 , 3098. [ Google Scholar ] [ CrossRef ] [ Green Version ]
  • Leithardt, V.; Santos, D.; Silva, L.; Viel, F.; Zeferino, C.; Silva, J. A Solution for Dynamic Management of User Profiles in IoT Environments. IEEE Lat. Am. Trans. 2020 , 18 , 1193–1199. [ Google Scholar ] [ CrossRef ]
  • Zwitter, A.; Gstrein, O.J. Big data, privacy and COVID-19–learning from humanitarian expertise in data protection. J. Int. Humanit. Action 2020 , 5 . [ Google Scholar ] [ CrossRef ]
  • Yesmin, T.; Carter, M.W. Evaluation framework for automatic privacy auditing tools for hospital data breach detections: A case study. Int. J. Med. Inform. 2020 , 138 . [ Google Scholar ] [ CrossRef ] [ PubMed ]
  • Islam, S.M.R.; Kwak, D.; Kabir, M.H.; Hossain, M.; Kwak, K. The Internet of Things for Health Care: A Comprehensive Survey. IEEE Access 2015 , 3 , 678–708. [ Google Scholar ] [ CrossRef ]
  • Sun, J.; Zhu, X.; Zhang, C.; Fang, Y. HCPP: Cryptography Based Secure EHR System for Patient Privacy and Emergency Healthcare. In Proceedings of the 2011 31st International Conference on Distributed Computing Systems, Minneapolis, MN, USA, 20–24 June 2011; pp. 373–382. [ Google Scholar ] [ CrossRef ] [ Green Version ]
  • Samaila, M.; Neto, M.; Fernandes, D.; Freire, M.; Inácio, P. Challenges of Securing Internet of Things Devices: A survey. Secur. Priv. 2018 , 1 . [ Google Scholar ] [ CrossRef ] [ Green Version ]
  • Plachkinova, M.; Andrés, S.; Chatterjee, S. A Taxonomy of mHealth Apps–Security and Privacy Concerns. In Proceedings of the 2015 48th Hawaii International Conference on System Sciences, Kauai, Hawaii, 5–8 January 2015; pp. 3187–3196. [ Google Scholar ] [ CrossRef ]
  • Alsubaei, F.; Abuhussein, A.; Shiva, S. Security and Privacy in the Internet of Medical Things: Taxonomy and Risk Assessment. In Proceedings of the 2017 IEEE 42nd Conference on Local Computer Networks Workshops (LCN Workshops), Singapore, 9–12 October 2017; pp. 112–120. [ Google Scholar ] [ CrossRef ]
  • Hankerson, D.; Menezes, A.J.; Vanstone, S. Guide to Elliptic Curve Cryptography ; Springer Science & Business Media: Berlin, Germany, 2005; Volume 46, p. 13. [ Google Scholar ]
  • Yi, X.; Bertino, E.; Rao, F.Y.; Bouguettaya, A. Practical privacy-preserving user profile matching in social networks. In Proceedings of the 2016 IEEE 32nd international conference on data engineering (ICDE), Helsinki, Finland, 16–20 May 2016; pp. 373–384. [ Google Scholar ] [ CrossRef ]
  • Sivaraman, V.; Gharakheili, H.H.; Vishwanath, A.; Boreli, R.; Mehani, O. Network-level security and privacy control for smart-home IoT devices. In Proceedings of the 2015 IEEE 11th International Conference on Wireless and Mobile Computing, Networking and Communications (WiMob), Abu Dhabi, UAE, 19–21 October 2015; pp. 163–167.
  • Stojanović, R.; Škraba, A.; Lutovac, B. A Headset Like Wearable Device to Track COVID-19 Symptoms. In Proceedings of the 2020 9th Mediterranean Conference on Embedded Computing (MECO), Budva, Montenegro, 8–11 June 2020; pp. 1–4.
  • Cecilia, J.M.; Cano, J.; Hernández-Orallo, E.; Calafate, C.T.; Manzoni, P. Mobile crowdsensing approaches to address the COVID-19 pandemic in Spain. IET Smart Cities 2020, 2, 58–63.
  • Riza, T.A.; Gunawan, D. IEEE 802.11ah Network Challenges Supports Covid-19 Prevention Team. In Proceedings of the 2020 IEEE 10th International Conference on Electronics Information and Emergency Communication (ICEIEC), Beijing, China, 17–19 July 2020; pp. 73–76.
  • Zeinalipour-Yazti, D.; Claramunt, C. COVID-19 Mobile Contact Tracing Apps (MCTA): A Digital Vaccine or a Privacy Demolition? In Proceedings of the 2020 21st IEEE International Conference on Mobile Data Management (MDM), Versailles, France, 30 June–3 July 2020; pp. 1–4.
  • Feistel, H. Cryptography and Computer Privacy. Sci. Am. 1973, 228, 10.
  • Cesconetto, J.; Augusto Silva, L.; Bortoluzzi, F.; Navarro-Cáceres, M.; Zeferino, C.A.; Leithardt, V.R.Q. PRIPRO—Privacy Profiles: User Profiling Management for Smart Environments. Electronics 2020, 9, 1519.
  • Vital, L.P.; Café, L.M.A. Ontologias e taxonomias: Diferenças. Perspect. Ciênc. Inform. 2011, 16, 115–130.
  • Fengou, M.; Mantas, G.; Lymperopoulos, D.; Komninos, N. Ubiquitous Health Profile Management Applying Smart Card Technology. In International Conference on Wireless Mobile Communication and Healthcare; Springer: Berlin/Heidelberg, Germany, 2011; Volume 83.
  • Leithardt, V.; Borges, G.; Rossetto, A.; Rolim, C.; Geyer, C.; Correia, L.; Nunes, D.; Sá Silva, J. A Privacy Taxonomy for the Management of Ubiquitous Environments. J. Commun. Comput. 2013, 10, 1529–1553.
  • Senthilkumar, S.; Viswanatham, V.M. HB-PPAC: Hierarchy-based privacy preserving access control technique in public cloud. Int. J. High Perform. Comput. Netw. 2017, 10, 13.
  • Ibraimi, L.; Asim, M.; Petković, M. Secure Management of Personal Health Records by Applying Attribute-Based Encryption. In Proceedings of the 6th International Workshop on Wearable, Micro, and Nano Technologies for Personalized Health, Oslo, Norway, 24–26 June 2009; pp. 71–74.
  • Centers for Medicare & Medicaid Services. The Health Insurance Portability and Accountability Act of 1996 (HIPAA). 1996. Available online: http://www.cms.hhs.gov/hipaa/ (accessed on 25 August 2020).
  • Cooper, T.; Collman, J. Managing Information Security and Privacy in Healthcare Data Mining. In Medical Informatics: Knowledge Management and Data Mining in Biomedicine; Chen, H., Fuller, S.S., Friedman, C., Hersh, W., Eds.; Springer: Boston, MA, USA, 2005; pp. 95–137.
  • Costa, R.G.R. Apontamentos para a arquitetura hospitalar no Brasil: Entre o tradicional e o moderno. Hist. Cienc. Saude-Manguinhos 2011, 18, 53–66.
  • Lopes, S. Data Privacy in Interoperability Contexts—The Area of Health. Ph.D. Thesis, Universidade de Évora, Évora, Portugal, 2016.
  • Yeniman Yildirim, E.; Akalp, G.; Aytac, S.; Bayram, N. Factors Influencing Information Security Management in Small- and Medium-Sized Enterprises: A Case Study from Turkey. Int. J. Inf. Manag. 2011, 31, 360–365.
  • Florence, G.; Calil, S.J. Uma nova perspectiva no controle dos riscos da utilização de tecnologia médico-hospitalar. MultiCiência 2005, 5, 1–14.
  • Machado, D.; Doneda, D. Proteção de Dados Pessoais e Criptografia: Tecnologias Criptográficas Entre Anonimização e Pseudonimização de Dados. Rev. Trib. 2019, 998, 99–125.
  • Brasil. Lei n. 13.709, de 14 de Agosto de 2018. Lei Geral de Proteção de Dados Pessoais (LGPD). 2018. Available online: http://www.planalto.gov.br/ccivil_03/_ato2015-2018/2018/lei/L13709.htm (accessed on 30 August 2020).
  • Lupiana, D.; O’Driscoll, C.; Mtenzi, F. Taxonomy for ubiquitous computing environments. In Proceedings of the 2009 First International Conference on Networked Digital Technologies, Ostrava, Czech Republic, 28–31 July 2009; pp. 469–475.
  • Pradilla, J.; Esteve, M.; Palau, C. SOSFul: Sensor Observation Service (SOS) for Internet of Things (IoT). IEEE Lat. Am. Trans. 2018, 16, 1276–1283.
  • Fowler, M. UML Essencial: Um Breve Guia Para Linguagem Padrao; Bookman: Orange, CA, USA, 2005.
  • Menezes, A.J.; Vanstone, S.A.; Oorschot, P.C.V. Handbook of Applied Cryptography, 1st ed.; CRC Press, Inc.: Boca Raton, FL, USA, 1996.
  • Singhal, T. A Review of Coronavirus Disease-2019 (COVID-19). Indian J. Pediatr. 2020, 87, 281–286.

[Table: related work from 2007 to 2020 compared with the present proposal on four criteria: Cryptography, Private Profile, Devices and Taxonomy.]

Share and Cite

Verri Lucca, A.; Augusto Silva, L.; Luchtenberg, R.; Garcez, L.; Mao, X.; García Ovejero, R.; Miguel Pires, I.; Luis Victória Barbosa, J.; Reis Quietinho Leithardt, V. A Case Study on the Development of a Data Privacy Management Solution Based on Patient Information. Sensors 2020, 20, 6030. https://doi.org/10.3390/s20216030


The ICO exists to empower you through information.

Case studies and examples


Our data sharing code provides real-world examples and case studies of different approaches to data sharing, including where organisations found innovative ways to share data while protecting people’s information.

Here are some case studies additional to those in the code.

Data sharing to improve outcomes for disadvantaged children and families

Further case studies in this collection:

  • Sharing with partners in the voluntary or private sector
  • Landlord and tenant data sharing
  • Sharing medical records of care home residents
  • Ensuring children’s welfare: data sharing by local authorities with Ofsted, the regulator of social care and early years provision in England
  • Effective information sharing between the police and Ofsted in England

  • Sharing medical records between GP practice and hospital trust

Improving data sharing processes and practices at an NHS trust

Improving health services with responsible data sharing

Sharing health data for research purposes

Social workers frequently need access to information about children and their families when deciding whether there is a safeguarding risk and what support is most appropriate.

Two councils in different areas of the UK partnered with a not-for-profit organisation to find a data sharing solution where social workers would have all the information they need from the start.

After extensive user research and workshops with stakeholders and families, they found that social workers needed access to the contact details of the lead practitioner of a case from other services (police, housing, schools and adult social care), and basic information about when the service was last involved with the family. The research found that sharing such data would:

  • reduce the amount of time social workers spend looking for information;
  • enable more joint working among services (eg children’s social care working more closely with adult social care);
  • ensure social workers have access to all the information they need when assessing safeguarding risk and making support decisions for children and their families; and
  • allow children and families to access better, more timely services.

At the same time, the two councils and the not-for-profit organisation explored the information governance and ethical implications of accessing and using sensitive personal data within social care. They ran ethics workshops with the project team and conducted user research with those most likely to be affected by the data sharing (residents who have had contact with social care and social workers).

The research enabled the two councils to design, build and embed a digital data sharing solution that empowers social workers, enables professional judgement, protects privacy, and ultimately enables children and their families to access the right support and reach their potential.

A group of voluntary sector organisations worked with health and social care partners (both private and public sectors) on a project to deliver improved outcomes for older people in the community and in hospital.

The project team recognised that it needed to establish a culture of shared information, along with a phased, proactive approach to seeking individuals’ consent. It also recognised that the involvement of volunteers could have implications for the sharing of data within the project team, as they have a different legal status to the agencies’ employees and might not have received the same level of training as employees in the work of the organisation.

The project was set up as follows:

  • The volunteers signed contracts setting out their roles, responsibilities and standards - including those for information security - equivalent to those of the agencies’ employees. The contracts were intended to formalise and support the volunteers’ responsibilities for gathering and sharing information. Training and ongoing support were provided to the volunteers.
  • GPs asked their elderly patients whether they would like to take part in the project. They were asked specifically whether they agreed to relevant information from their health record being shared with a multi-disciplinary project team consisting of health, social care and voluntary sector practitioners.
  • At the initial home visit, the volunteer explained the information-sharing aspects of the service and asked for written consent.
  • All of the organisations and GP practices involved in the project entered into a single data sharing agreement. This built accountability and trust between the agencies involved.

Note that it was important to consider whether the necessary legal power or ability to share personal data was in place. The legal power is separate from the lawful basis for data processing.

A housing association occasionally received requests from organisations such as utility companies, debt collectors and councils for information about current and former tenants. However, it did not consider it appropriate to enter into a data sharing agreement, as the sharing was not on a regular basis.

On one occasion, a utility company contacted the housing association and asked for the forwarding address of a former tenant who was in arrears on his gas and electricity account. The housing association disclosed the information because they had advised tenants at the start of their tenancy that they would make such disclosures because of the contractual relationship between tenants and the utility company. All tenants had agreed to this.

On another occasion, a debt collection company acting for a third party contacted the housing association for the forwarding address of a former tenant. The housing association decided that it could not disclose the information because it had no lawful basis for the disclosure. It withheld the tenant’s new address from the debt collection company.

The housing association dealt with requests for information effectively because it had put a system in place which required a senior person or group of people, trained in data protection, to decide whether or not to release personal information on a case-by-case basis.

This involved verifying the identity of the requester, insisting that all requests were in writing and ensuring that the requester provided enough information to make a proper decision. If the housing association decided to share the information, they only provided relevant, necessary information and, in every case, they made a record of the disclosure decision.
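As an illustration only, the case-by-case decision process described above can be sketched as a small routine. Every name here (DisclosureRequest, handle_request, the log fields) is hypothetical and not drawn from any real system:

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

# Hypothetical sketch of the housing association's process: requests are
# granted only when they are in writing, the requester's identity has been
# verified, and a lawful basis for disclosure exists; every decision is
# recorded either way.

@dataclass
class DisclosureRequest:
    requester: str
    in_writing: bool
    identity_verified: bool
    lawful_basis: Optional[str]  # e.g. "tenants notified at start of tenancy"

disclosure_log: list = []

def handle_request(req: DisclosureRequest) -> bool:
    """Decide on a disclosure request and record the decision."""
    disclosed = (req.in_writing
                 and req.identity_verified
                 and req.lawful_basis is not None)
    disclosure_log.append({
        "requester": req.requester,
        "disclosed": disclosed,
        "basis": req.lawful_basis,
        "date": date.today().isoformat(),
    })
    return disclosed

# Utility company: tenants had agreed to such disclosures -> shared.
print(handle_request(DisclosureRequest(
    "utility company", True, True, "tenants notified at start of tenancy")))  # True
# Debt collection company: no lawful basis -> withheld.
print(handle_request(DisclosureRequest(
    "debt collection company", True, True, None)))  # False
```

The point of the sketch is the unconditional log entry: a record is kept whether or not the information is released, which is what supports the accountability described above.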

Staff in a privately-owned care home did not have access to the recent medical history of residents. Instead, the home used to phone the GP practice or call out a GP every time they needed more information. This could be a risk, as the staff might need to check quickly what medicines residents were taking and at what dosages.

To make the process more efficient, the care home and the local GP practice signed up to a formal data sharing agreement, so the care home staff would have access to their residents’ electronic medical records when necessary.

The GP practice and local Clinical Commissioning Group made potential residents aware that if they were admitted to the care home there was a possibility that their medical record would be accessed. In addition, when patients were admitted to the care home, their explicit consent, or that of their representatives, was sought before their electronic medical record was accessed. Where consent was not provided, the former system of contacting a GP would continue to be used.

Other key features of the data sharing agreement were:

  • access to residents’ records could only take place while they were under the care of the home;
  • access was restricted to the clinical and professional nursing staff at the care home;
  • access was only allowed where this was necessary to provide treatment and for residents’ safety;
  • access was restricted to information relevant to the provision of care to residents;
  • access to the information was by secure means; and
  • the information obtained was held securely and in accordance with good information management practice.
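Read together, the access conditions above form a simple conjunctive policy: every condition must hold for access to be allowed. A minimal sketch, with all role and purpose names invented for illustration rather than taken from the agreement:

```python
from dataclasses import dataclass

# Hypothetical encoding of the agreement's access conditions; the role and
# purpose names are illustrative only.

CLINICAL_ROLES = {"clinician", "professional nurse"}
PERMITTED_PURPOSES = {"treatment", "resident safety"}

@dataclass
class AccessRequest:
    role: str                # who is asking
    resident_in_care: bool   # resident currently under the home's care
    purpose: str             # why access is needed
    secure_channel: bool     # access is by secure means

def may_access_record(req: AccessRequest) -> bool:
    """All of the agreement's conditions must hold; failing any
    one of them denies access."""
    return (req.role in CLINICAL_ROLES
            and req.resident_in_care
            and req.purpose in PERMITTED_PURPOSES
            and req.secure_channel)

print(may_access_record(AccessRequest("professional nurse", True, "treatment", True)))  # True
print(may_access_record(AccessRequest("administrator", True, "treatment", True)))       # False
print(may_access_record(AccessRequest("clinician", False, "treatment", True)))          # False
```

A deny-by-default conjunction like this mirrors how such agreements are usually framed: access is the exception that must satisfy every safeguard, not the rule.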

A formal data sharing agreement can put in place effective safeguards for residents and can ensure the various parties involved in data sharing are working to a common set of rules. An agreement can also help to deal with the ethical and confidentiality issues that can arise in health and social care.

Even if there is a data sharing agreement in place, organisations still need to make sure that individuals whose data may be shared are aware of what is taking place. This can be done through the privacy information they provide, using various methods. In the circumstances outlined here, it might be more effective to talk to individuals to explain the situation and to find out whether they agree to their information being shared. Their decision needs to be documented.

Data sharing can help ensure the welfare of children and other vulnerable individuals.

This example concerns the sharing of personal data by staff in local authorities with Ofsted, in its role as the regulator of social care and early years provision in England.

The example focuses in particular on the role of the Local Authority Designated Officer (LADO), who is responsible for managing child protection concerns or allegations made against staff and volunteers who work with children and young people.

Data protection enables fair and proportionate data sharing. That means that LADOs should be confident they can share relevant information with other local authorities and with Ofsted. The information shared by LADOs helps Ofsted to build a complete picture about an individual’s suitability to provide services to children.

Mr D wants to register with Ofsted to provide a holiday play scheme for children in the Westtown Borough Council area. He has previously worked in a setting providing social care to children in the Easttown Borough Council area. His home is in the Northtown Borough Council area.

In order for Ofsted to reach a properly informed judgement on the suitability of anyone to provide services to children, it needs all relevant information about them. It is essential for the LADOs in Easttown and Northtown to share the information about Mr D with Ofsted when requested. This is the case irrespective of where Mr D lives or works.

This data sharing is vital, in order for Ofsted’s registration system to be effective in ensuring the safety of children.

The Chief Constable in Barsetshire police force promotes a culture where the safety of children is paramount. That includes officers in the force alerting authorities and sharing information appropriately to protect children from harm.

Officers are familiar with the role of Ofsted as the regulator in England of early years settings including childminders and nurseries, and of children’s social care services including children’s homes. Because of this, officers know that the information they share can be used by Ofsted to make children safe.

The force has provided a named contact that Ofsted staff can get in touch with, if they need to talk about concerns at any institution that Ofsted inspects or regulates. The police have been given a regional contact in Ofsted that they can get in touch with about any new information.

Police receive a call-out to a children’s home because a child has gone missing. This is not the first occasion that this child has gone missing. The child has a history of unexplained absences and is found hanging around in a local park with older young people, some of whom are known to police as gang members.

The police officers have two linked concerns that lead to them sharing information with authorities: the safety of the child who went missing, and the safety of the children’s home.

Actions taken by the police officers:

1. To safeguard the child, they contact the children’s social care team in the local authority and share information with social workers about the child’s involvement with the gang.

2. The police also contact Ofsted to tell them they are concerned that there have been multiple police call-outs to this children’s home because of children going missing. The children are vulnerable and the police consider they are at a high risk of involvement with a local gang.

This information is valuable to Ofsted, which can use it to help the young people concerned. The children’s home had notified Ofsted about the child going missing, but they did not include information about the child being at risk of gang involvement.

Ofsted now considers the intelligence from police and, under its regulatory role, decides to visit the children’s home to find out what the manager and staff are doing to keep children safe and to reduce the risk of children being groomed by local gangs.

An inspector from Ofsted visits the home and finds that staff were unaware of the possible gang involvement by children in the home. Staff had not talked to children to find out where they were going or what they were doing, and although they had noticed some changes in the behaviour of the child who went missing, they had not recorded this or notified the child’s social worker. The inspector’s view is that safeguarding arrangements in the home do not appear to comply with the relevant regulations.

Because of the information shared by the police and the findings of the inspector, Ofsted is able to take regulatory action to ensure that safeguarding arrangements at the children’s home are improved. Ofsted schedules further visits to monitor practice at the home and to check that improvements have been made. The inspector continues to liaise with police to monitor the welfare of children in the home.

Sharing medical records between a GP practice and hospital trust

These scenarios apply to England only, but the general principles are relevant in Northern Ireland, Scotland and Wales, where health services are a devolved matter.

A GP practice received a request for the records of one of their patients, who is receiving care in a hospital in another part of the country. This is outside of the local shared care record initiative, a system that governs patient record sharing locally. The practice is unsure whether they require the patient’s consent to share the data.

Health and care settings often use the concept of consent. However, it is often misunderstood due to the use of the term in different contexts. In this case, the consent required to view and share confidential medical information is different from the consent that the data protection legislation defines as a lawful basis for processing personal data.

To help the GP practice, the hospital directs them to information available on the NHS IG Portal, a service that provides specific information governance advice to organisations that provide care services. The hospital also reminds the practice of their responsibilities under the Health and Social Care (Quality and Safety) Act 2015 and Caldicott Principle 7. These allow them to share someone’s personal data where it is likely to enable them to receive health or social care, and this is in their best interests.

After reading the guidance, the practice understands how this separate legal requirement for consent in a health and social care context interacts with consent as a lawful basis under data protection legislation. In this circumstance (sharing data for direct care purposes), they can share the data without the explicit consent of the patient. Consent is implied by the provision of health and care (ie, it is within the reasonable expectation of the patient for the practice to share information for these purposes). In addition, health and care staff have a legal duty to share information to support direct care.

Through the use of sector-specific guidance, organisations can reach a shared understanding of the data protection requirements for sharing data. This can reduce the friction that occurs between organisations as they consider their separate obligations under data protection law.

After receiving criticism that their procedures are hindering data sharing, an NHS trust’s information governance department establishes a new process within the organisation. This ensures people consult them in good time as part of any new processing activity that requires personal data.

In order to do this, they:

  • seek senior or executive level support for the proposal eg by the Senior Information Risk Owner (SIRO) or board where applicable;  
  • identify and review the points within the organisation where they establish new data processing activities and build information governance into business case and procurement checklists;
  • ensure timescale allocation for setting up required legal and governance documents such as data sharing agreements;
  • devise new template data protection impact assessments and data sharing agreements for organisations to use to simplify their processes;
  • provide training to the relevant staff and issue further communications across the organisation to highlight the new processes;
  • build professional networks with information governance colleagues in local organisations to learn best practice approaches and improve the information governance culture;
  • establish a review process to help understand occasions where they could not share data and apply the lessons learnt to future data sharing plans; and
  • hold a drop-in ‘meet the team’ session or issue an information sheet about their work and how their early participation will benefit colleagues.

Following this review and process redesign, the information governance team are now informed in good time about any new processing. They can ensure the team takes the appropriate governance steps before new processing takes place.

A healthcare provider is looking to improve the services they offer their patients. The organisation realises that by sharing appropriate levels of data with other care organisations in the area, they can improve those services. However, the organisation has traditionally avoided risk when it comes to sharing data, and this adversely impacts the quality of care it can provide.

As the organisation looks to improve their data sharing practices, they decide to find ways to assure themselves that whenever they share data, they are doing so responsibly. They want to make sure they are adhering to the requirements of data protection law, common law and their responsibilities to their service users.

They refer to the considerations the ICO lays out in the data sharing checklist in the data sharing code of practice. The organisation builds on this by adding the following checks:

  • The status of the organisation (with respect to the legal powers provided by the Health and Social Care Act 2015 etc).
  • The nature of the processing and purpose for which the organisation needs to share data.
  • The status of the organisation they plan to share the data with, which could include reviewing the information in the NHS Data Security and Protection Toolkit (DSPT).
  • Other appropriate due diligence checks such as the NHS Digital Technology Assessment Criteria (DTAC).
  • The amount of data being requested for the purpose or purposes they are using it for.
  • The necessity for the data sharing (does it need to happen, or can the organisation achieve the purpose another way, for example using anonymised data?)
  • Ensuring the organisation has suitably informed the patients or service users of the proposed sharing and of their data subject rights.

After implementing this approach, the organisation feels more confident about sharing data. By keeping a record of their decisions, they are also demonstrating their accountability for their actions.

A hospital trust is preparing to trial a medical device that they are developing to support clinical decision-making for patients suffering from heart disease. The device is a data-driven app that applies a risk model based on details from the patient’s medical history. Although members of the trust’s clinical team developed the risk model, a third-party private company are developing the app itself.

The trust wishes to use patient data to support the research phase of the app development, which is part of the approval process for medical devices. This involves sharing patient data with the app’s developers for research purposes. As the app developer will need health information that is capable of identifying people, the hospital trust needs a legal basis for lifting the common law obligation of confidentiality to disclose and use the information for the purposes of this research programme. Before the trust shares the data, they consider a number of questions as part of their data protection impact assessment (DPIA), including the following:

  • What is the lawful basis under UK GDPR to process this data?
  • What can they do to minimise the amount of data they need to process to effectively perform this task?
  • Will the trust be able to get explicit consent (common law) from each patient to view their medical information for this purpose? Is this practical? Are there other ways to satisfy the common law?
  • What approvals do they require in order to carry out the research?

Following a review of guidance relating to confidentiality and consent available on the NHS IG Portal, the trust understands that they can identify a lawful basis under UK GDPR. However, for common law purposes they need to make an application to the Confidentiality Advisory Group (CAG) under section 251 of the NHS Act 2006 for advice on whether the research group can access the data without the patients’ explicit consent. This is because the purpose of the processing is not direct care, and they do not have the implied consent of the patient to access this data (under common law).

Following a successful CAG application and approval, the trust could share the information from their patient records in order to carry out this research. Analysis of the confidential patient information meant that the trust could confirm the effectiveness of their risk model and seek approval for their medical device.

The International Forum for Responsible Media Blog


Top 10 Privacy and Data Protection Cases of 2018: a selection


  • Cliff Richard v. The British Broadcasting Corporation [2018] EWHC 1837 (Ch).

This was Sir Cliff Richard’s privacy claim against the BBC and was the highest-profile privacy case of the year. The claimant was awarded damages of £210,000. We had a case preview and case reports on each day of the trial, and posts from a number of commentators including Paul Wragg, Thomas Bennett (first and second) and Jelena Gligorijević. The BBC subsequently announced that it would not seek permission to appeal.

  • ABC v Telegraph Media Group Ltd [2018] EWCA Civ 2329 .

This was perhaps the second most discussed privacy case of the year. The Court of Appeal allowed the claimants’ appeal and granted an interim injunction to prevent the publication of confidential information about alleged “discreditable conduct” by a high-profile executive. Lord Hain subsequently named the executive as Sir Philip Green. We had a case comment from Persephone Bridgman Baker. We also had comments criticising Lord Hain’s conduct from Paul Wragg, Robert Craig and Tom Double.

  • Ali v Channel 5 Broadcast ([2018] EWHC 298 (Ch)).

The claimants had featured in a “reality TV” programme about bailiffs, “Can’t Pay? We’ll Take It Away!”. Their claim for misuse of private information was successful and damages of £20,000 were awarded. We had a case comment from Zoe McCallum. An appeal and cross appeal were heard on 4 December 2018 and judgment is awaited.

  • NT1 and NT2 v Google Inc [2018] 3 WLR 1165.

This was the first “right to be forgotten” claim in the English courts, with claims in both data protection and privacy. Both claimants had spent convictions; one was successful and the other was not. We had a case preview from Aidan Wills and a comment on the case from Iain Wilson.

  • Lloyd v Google LLC [2018] EWHC 2599 (QB) .

This was an attempt to bring a “representative action” in data protection on behalf of all iPhone users in respect of the “Safari Workaround”. The representative claimant was refused permission to serve Google out of the jurisdiction. We had a case comment from Rosalind English. There was a Panopticon Blog post on the case. The claimant has been given permission to appeal and it is likely that the appeal will be heard in late 2019.

  • TLU v Secretary of State for the Home Department [2018] EWCA Civ 2217 .

The Court of Appeal dismissed an appeal in a “data leak” case on the issue of liability to individuals affected by a data leak but not specifically named in the leaked document. We had a case comment from Lorna Skinner and further comment from Iain Wilson. There was also a Panopticon Blog post.

  • Stunt v Associated Newspapers [2018] EWCA Civ 170 .

The Court of Appeal referred the question of whether the “journalistic exemption” in section 32(4) of the Data Protection Act 1998 is compatible with the Data Protection Directive and the EU Charter of Fundamental Rights to the CJEU.  There was a Panopticon Blog post on the case.

  • Various Claimants v W M Morrison Supermarkets plc [2018] EWCA Civ 2339 .

The Court of Appeal upheld the decision of Langstaff J that Morrisons were vicariously liable for a mass data breach caused by the criminal act of a rogue employee. We had a case comment from Alex Cochrane. There was a Panopticon Blog post on the case.

  • Big Brother Watch v. Secretary of State [2018] ECHR 722 .

An important case in which the European Court of Human Rights held that secret surveillance regimes including the bulk interception of external communications violated Articles 8 and 10 of the Convention. We had a post by Graham Smith as to the implications of this decision for the present regime.

  • ML and WW v Germany ([2018] ECHR 554).

This was the first case in the European Court of Human Rights on the “right to be forgotten”. It was an application under Article 8 in respect of the historic publication by the media of information concerning a murder conviction. The application was dismissed. We had a case comment from Hugh Tomlinson and Aidan Wills. There was also a Panopticon blog post on the case.

Share this:

Caselaw , Data Protection , Privacy

2018 Top 10 Privacy and Data Protection Cases

Ethics in Data Science: Handling Sensitive Information

In the era of big data, data science has emerged as a transformative field, driving innovation across various industries. However, with great power comes great responsibility. Data scientists often deal with sensitive information, including personal data, financial records, and health information, raising significant ethical concerns. Let’s delve into the ethical considerations in data science, focusing on the responsible handling of sensitive information.

Understanding Sensitive Information

Sensitive information refers to data that, if disclosed, could cause harm to individuals or organizations. This includes, but is not limited to:

Personally Identifiable Information (PII): Names, addresses, Social Security numbers, and other information that can identify an individual.

Financial Information: Credit card numbers, bank account details, and transaction histories.

Health Information: Medical records, health insurance information, and genetic data.

Confidential Business Information: Trade secrets, business plans, and proprietary algorithms.
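As a rough illustration of how such categories translate into practice, the sketch below masks designated PII fields before a record leaves a trusted boundary. The field names and masking rule are hypothetical, not drawn from the article:

```python
# Illustrative sketch: mask designated PII fields in a record.
# Field names (PII_FIELDS) and the "keep last 4 characters" rule
# are assumptions for demonstration only.
PII_FIELDS = {"name", "address", "ssn", "credit_card"}

def mask_value(value: str) -> str:
    """Replace all but the last four characters with asterisks."""
    return "*" * max(len(value) - 4, 0) + value[-4:]

def redact_record(record: dict) -> dict:
    """Return a copy of the record with PII fields masked."""
    return {
        key: mask_value(str(val)) if key in PII_FIELDS else val
        for key, val in record.items()
    }

record = {"name": "Jane Smith", "ssn": "123-45-6789", "plan": "basic"}
print(redact_record(record))  # name and ssn masked, plan untouched
```

A real pipeline would pair this with a maintained data inventory so the set of sensitive fields is governed by policy rather than hard-coded.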

The Ethical Imperatives

Ethics in data science is not just about complying with laws and regulations; it’s about doing the right thing even when no one is watching. The following ethical principles are crucial for handling sensitive information responsibly:

Privacy: Respecting individuals' privacy by ensuring that their data is used and stored securely.

Transparency: Being open about data collection methods, purposes, and how data will be used.

Consent: Obtaining explicit consent from individuals before collecting or using their data.

Security: Implementing robust security measures to protect data from unauthorized access and breaches.

Fairness: Ensuring that data practices do not lead to discrimination or unfair treatment of individuals or groups.

Ethical Challenges in Data Science

Data scientists face numerous ethical challenges when handling sensitive information. These challenges include:

1. Informed Consent

One of the fundamental principles of ethics in data science is obtaining informed consent. However, obtaining true informed consent can be challenging. Individuals must be fully aware of what data is being collected, how it will be used, and the potential risks involved. This requires clear communication, which can be difficult to achieve, especially in complex data projects.

Case Study: Facebook-Cambridge Analytica Scandal

In 2018, it was revealed that Cambridge Analytica had harvested personal data from millions of Facebook profiles without users' consent and used it for political advertising. This incident highlighted the importance of informed consent and the need for transparency in data collection practices.

2. Data Anonymization and De-Identification

Anonymizing or de-identifying data is often used to protect individuals' privacy. However, de-identified data can sometimes be re-identified, especially when combined with other data sources. Ensuring that anonymization techniques are robust and that re-identification risks are minimized is crucial.
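Two of the safeguards mentioned here can be sketched in a few lines: keyed pseudonymization of direct identifiers, and a k-anonymity check over quasi-identifiers (the attribute combinations that enable re-identification). The secret key, field names, and sample rows are assumptions for illustration:

```python
# Sketch: pseudonymize direct identifiers with a keyed hash, and
# measure k-anonymity over quasi-identifier columns.
# SECRET_KEY and the column names are hypothetical.
import hashlib
import hmac
from collections import Counter

SECRET_KEY = b"rotate-and-store-me-securely"  # illustrative key

def pseudonymize(identifier: str) -> str:
    """Replace a direct identifier with a keyed HMAC-SHA256 token."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()[:16]

def k_anonymity(rows: list[dict], quasi_identifiers: list[str]) -> int:
    """Smallest group size when rows are grouped by quasi-identifiers."""
    groups = Counter(tuple(row[q] for q in quasi_identifiers) for row in rows)
    return min(groups.values())

rows = [
    {"zip": "02139", "age_band": "30-39"},
    {"zip": "02139", "age_band": "30-39"},
    {"zip": "94103", "age_band": "40-49"},  # unique combination
]
print(k_anonymity(rows, ["zip", "age_band"]))  # 1: one row is fully identifiable
```

A k of 1 means at least one person is unique on the chosen attributes, which is exactly the condition the Netflix example below exploited; robust release processes generalize or suppress values until k reaches an acceptable threshold.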

Example: The Netflix Prize Data Release

In 2006, Netflix released anonymized movie ratings data as part of a competition to improve its recommendation algorithm. However, researchers were able to re-identify individuals by cross-referencing the dataset with other publicly available information. This incident underscores the challenges of true anonymization.

3. Bias and Fairness

Data scientists must be vigilant about biases in their data and algorithms. Biases can lead to unfair treatment of certain groups and perpetuate existing inequalities. Ensuring that data is representative and that algorithms are tested for fairness is essential.
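One common first probe for the fairness testing described above is the demographic parity difference: the gap in positive-outcome rates between groups. The group labels and outcomes below are invented, and this single metric is a starting point, not a complete fairness audit:

```python
# Sketch: demographic parity difference across groups.
# Group names and outcome data are illustrative assumptions.
def selection_rate(outcomes: list[int]) -> float:
    """Fraction of positive (1) outcomes."""
    return sum(outcomes) / len(outcomes)

def demographic_parity_difference(by_group: dict[str, list[int]]) -> float:
    """Gap between the highest and lowest group selection rates."""
    rates = [selection_rate(o) for o in by_group.values()]
    return max(rates) - min(rates)

outcomes = {"group_a": [1, 1, 0, 1], "group_b": [1, 0, 0, 0]}
gap = demographic_parity_difference(outcomes)
print(f"selection-rate gap: {gap:.2f}")  # 0.75 vs 0.25 -> gap 0.50
```

In practice a gap this large would trigger a deeper review of the training data and features, since parity alone cannot say whether the disparity is justified.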

Example: Discriminatory Hiring Algorithms

Several companies have faced criticism for using hiring algorithms that discriminate against certain demographic groups. For instance, an algorithm trained on historical hiring data might favor candidates similar to those who have been hired in the past, perpetuating existing biases.

4. Data Security

Protecting sensitive data from breaches and unauthorized access is paramount. Data breaches can lead to significant harm, including identity theft, financial loss, and damage to reputation. Implementing robust security measures and regularly auditing data security practices is essential.
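One concrete measure implied by this advice is to never store recoverable secrets: a salted, deliberately slow key-derivation hash limits what a stolen database reveals. This is a generic sketch of that pattern (the iteration count is an illustrative choice), not a description of what any particular company did:

```python
# Sketch: store only salted, slow-hashed secrets so a database
# breach does not directly expose them. Iteration count is an
# illustrative assumption; real deployments tune it to hardware.
import hashlib
import hmac
import secrets

def hash_secret(secret: str, salt: bytes = b"") -> tuple[bytes, bytes]:
    """Return (salt, PBKDF2-SHA256 digest) for a secret."""
    salt = salt or secrets.token_bytes(16)
    digest = hashlib.pbkdf2_hmac("sha256", secret.encode(), salt, 200_000)
    return salt, digest

def verify_secret(secret: str, salt: bytes, digest: bytes) -> bool:
    """Constant-time comparison against the stored digest."""
    return hmac.compare_digest(hash_secret(secret, salt)[1], digest)

salt, digest = hash_secret("correct horse battery staple")
print(verify_secret("correct horse battery staple", salt, digest))  # True
print(verify_secret("wrong guess", salt, digest))                   # False
```

Hashing credentials is only one layer; patching known vulnerabilities and auditing access controls, the failures highlighted in the case below, matter just as much.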

Case Study: Equifax Data Breach

In 2017, Equifax, one of the largest credit reporting agencies, suffered a data breach that exposed the personal information of 147 million people. The breach was attributed to weak security practices, highlighting the importance of strong data security measures.

Best Practices for Ethical Data Handling

To address these ethical challenges, data scientists and organizations should adopt the following best practices:

1. Establish Clear Data Governance Policies

Organizations should establish comprehensive data governance policies that outline how sensitive information will be collected, stored, used, and protected. These policies should be communicated clearly to all stakeholders and regularly reviewed and updated.

2. Implement Privacy by Design

Privacy by design is an approach that integrates privacy considerations into the development of systems and processes from the outset. This includes minimizing data collection, using anonymization techniques, and incorporating privacy safeguards into algorithms.

3. Conduct Ethical Impact Assessments

Before embarking on data projects, organizations should conduct ethical impact assessments to identify potential ethical issues and risks. These assessments should involve input from diverse stakeholders, including ethicists, legal experts, and representatives from affected groups.

4. Ensure Transparency and Accountability

Organizations should be transparent about their data practices and hold themselves accountable for ethical lapses. This includes providing clear information about data collection and usage practices, as well as mechanisms for individuals to access, correct, and delete their data.

5. Foster a Culture of Ethical Awareness

Ethics should be a core component of data science education and training. Organizations should foster a culture of ethical awareness by providing regular training on ethical issues and encouraging open discussions about ethical dilemmas.

Legal and Regulatory Frameworks

Various legal and regulatory frameworks govern the handling of sensitive information. Data scientists must be familiar with these frameworks and ensure compliance. Key regulations include:

1. General Data Protection Regulation (GDPR)

The GDPR is a comprehensive data protection regulation in the European Union that sets strict requirements for data handling, including obtaining consent, ensuring data security, and allowing individuals to access and control their data.

2. California Consumer Privacy Act (CCPA)

The CCPA is a data privacy law in California that gives residents the right to know what personal information is being collected about them, to whom it is being sold, and the ability to access and delete their data.

3. Health Insurance Portability and Accountability Act (HIPAA)

HIPAA is a U.S. law that establishes standards for protecting sensitive patient health information. It requires healthcare providers and other covered entities to implement safeguards to ensure the confidentiality, integrity, and availability of health information.

Emerging Trends and Future Directions

The field of data science is constantly evolving, and so too are the ethical challenges and considerations. Some emerging trends and future directions include:

1. AI Ethics and Governance

As AI and machine learning become more integrated into data science, there is a growing focus on AI ethics and governance. This includes developing frameworks for responsible AI development, ensuring transparency in AI decision-making, and addressing the societal impacts of AI.

2. Data Minimization

Data minimization is the practice of collecting only the data that is necessary for a specific purpose and retaining it only for as long as needed. This approach reduces the risk of data breaches and helps protect individuals' privacy.
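The two halves of data minimization, collect only what is needed and keep it only as long as needed, can be expressed as a small filtering pass. The 30-day window and the field whitelist below are assumptions chosen for illustration:

```python
# Sketch: enforce a retention window and a purpose-based field
# whitelist. RETENTION and NEEDED_FIELDS are illustrative policy
# choices, not values from the article.
from datetime import datetime, timedelta, timezone

RETENTION = timedelta(days=30)
NEEDED_FIELDS = {"user_id", "event", "timestamp"}

def minimize(records: list[dict], now: datetime) -> list[dict]:
    """Drop expired records and strip fields outside the stated purpose."""
    kept = []
    for rec in records:
        if now - rec["timestamp"] <= RETENTION:
            kept.append({k: v for k, v in rec.items() if k in NEEDED_FIELDS})
    return kept

now = datetime(2024, 6, 30, tzinfo=timezone.utc)
records = [
    {"user_id": 1, "event": "login", "ip": "203.0.113.7",
     "timestamp": now - timedelta(days=2)},
    {"user_id": 2, "event": "login", "ip": "198.51.100.9",
     "timestamp": now - timedelta(days=90)},  # past retention
]
print(minimize(records, now))  # one record survives, with "ip" stripped
```

Running such a pass on a schedule, rather than relying on ad hoc deletion, is what turns minimization from a principle into a control.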

3. Decentralized Data Models

Decentralized data models, such as federated learning, allow data to be processed locally on devices rather than being centralized in a single location. This approach can enhance privacy and security by keeping sensitive data on individuals' devices.
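The core federated idea, local computation with only aggregates leaving the device, can be shown with a toy one-parameter model. This is a deliberately minimal sketch; production systems add secure aggregation, differential privacy noise, and weighted averaging:

```python
# Toy sketch of federated averaging: each "device" takes one local
# gradient step on its own data, and only the averaged parameter is
# shared. Model (a single weight) and data are illustrative.
def local_update(weight: float, data: list[float], lr: float = 0.1) -> float:
    """One gradient step toward the local mean under squared-error loss."""
    grad = sum(weight - x for x in data) / len(data)
    return weight - lr * grad

def federated_average(weight: float, device_data: list[list[float]]) -> float:
    """Average the locally updated weights; raw data never leaves devices."""
    updates = [local_update(weight, d) for d in device_data]
    return sum(updates) / len(updates)

device_data = [[1.0, 2.0], [3.0, 5.0]]  # stays on each device
new_weight = federated_average(0.0, device_data)
print(round(new_weight, 3))  # 0.275
```

The privacy benefit is structural: the server sees parameter updates, not the underlying records, though real deployments still guard against inference from the updates themselves.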

4. Ethical AI Toolkits

Several organizations are developing ethical AI toolkits that provide guidelines, frameworks, and tools for responsible AI development. These toolkits can help data scientists navigate ethical challenges and implement best practices.

Ethics in data science is a critical and evolving field that requires continuous attention and commitment. As data scientists handle increasingly sensitive information, they must navigate complex ethical landscapes to ensure that their practices are responsible, transparent, and fair. By adhering to ethical principles, implementing best practices, and staying informed about legal and regulatory requirements, data scientists can help build trust and ensure that their work benefits society as a whole. The journey towards ethical data science is ongoing, but with diligence and dedication, it is possible to navigate this path responsibly.


Data Privacy Act of 2012: A Case Study Approach to Philippine Government Agencies Compliance

Bernie Fabito (National University, Philippines), Michelle Renee Domingo Ching (De La Salle University) and Nelson J. Celis (De La Salle University), Journal of Computational and Theoretical Nanoscience 24(10):7042-7046, October 2018.


Law Audience Journal (e-ISSN: 2581-6705)


Big Data Privacy and Data Protection: A Case Study Analysis Under GDPR

  • Post author: Varun Kumar
  • Post published: February 5, 2022
  • Post category: Volume 3 & Issue 3


Authored By: Ms. Nandini Tripathy (LL.M), Jindal Global Law School, O.P. Jindal Global University, Sonipat, Haryana


“Big data has proven to be a valuable asset for many businesses, enabling better processes and new business opportunities. On the other hand, big data gives far more people access to sensitive information that, if mishandled, could endanger individuals’ privacy and violate data protection rules. As a result, data controllers may face serious penalties, even bankruptcy, if they do not comply. The volume of data processed, stored, and gathered is increasing rapidly as ever more devices connect to the internet and to each other, posing new data security problems [1] ”.

I. CHALLENGE OF BIG DATA AND SECURITY:

When dealing with “big data”, traditional solutions do not meet the standards for ensuring security and privacy. Big data deployments frequently rely on conventional firewalls or application-layer controls to limit access to information, but firewalls and transport-layer security can be circumvented, data sources can be unknown, and anonymised data can be re-identified. For these reasons, advanced techniques are being introduced to verify, secure, and monitor huge volumes of data across infrastructure, security, and management. [2] According to the National Institute of Standards and Technology (NIST), cloud computing is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction.

II. DATA PROTECTION:

A system typically contains a large volume of personal data or information that corporations can use to gain a competitive advantage. As a result, we should consider where the limit for the usage of such information lies. In order to use data securely while also protecting privacy, we must first understand the privacy dangers connected with big data. While big data has many advantages for businesses of all kinds, it also poses a number of serious privacy issues, including: (1) data breaches, (2) data brokerage, and (3) data discrimination. [3]

III. REGULATORY FRAMEWORK IN INDIA:

The Information Technology Act of 2000 (“IT Act”), as well as the rules enacted under it, provides legal principles for data protection, covering data collection, storage, disclosure, and transfer. The Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011 (the “SPDI Rules”) require any organisation that processes, deals with, stores, or handles sensitive personal information in a computer system that it owns, controls, or operates to follow prescribed procedures and security measures. Several other Indian laws, in addition to the IT Act and the SPDI Rules, can apply to data protection, depending on the entity collecting the data and the type of data collected. [4]

IV. REGULATORY FRAMEWORK IN THE EU:

The mere fact that a website is accessible in the EU is inadequate to subject a business to the GDPR; however, other evidence of a company’s intent to offer goods or services in the EU may be relevant. Companies that are not based in the EU but are subject to the GDPR must appoint, in writing, an EU representative for GDPR compliance purposes. Small-scale, non-sensitive, occasional processing of data is an exception to this general rule.

The European Data Protection Board (EDPB) is an independent European body charged with ensuring that data protection rules are applied uniformly across the EU. The EDPB is established by the General Data Protection Regulation (GDPR) and comprises the European Data Protection Supervisor (EDPS) and the national data protection authorities of the EU/EEA countries. The European Commission participates in the activities and meetings of the Board but does not have a vote. The EDPS provides the secretariat for the EDPB, which acts under the instructions of the Chair of the Board. The GDPR also applies to personal data processed by a controller not established in the EU, but in a place where Member State law applies by virtue of public international law.

OECD Privacy Principle 3: Purpose Specification:

Personal data collection objectives should be specified no later than the time of collection, and future usage should be limited to achieving those goals or additional goals that are not incompatible with the primary purposes. [5]

OECD Privacy Principle 4: Use Limitation:

Personal data shall not be shared, made available, or used for purposes other than those mentioned in paragraph 9 of the OECD recommendation unless: a) with the consent of the data subject; or b) with legal authorisation (which would be related to the OECD privacy principle 3 – Purpose Specification).
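The use-limitation rule amounts to an access-control check: a processing request is permitted only if the requested purpose was consented to at collection time or is otherwise legally authorized. The subject IDs, purpose names, and legal bases below are hypothetical illustrations of that check, not part of the OECD text:

```python
# Sketch of a use-limitation gate per OECD Principle 4: allow a
# processing purpose only with consent or legal authorisation.
# All registry contents are illustrative assumptions.
CONSENTED_PURPOSES = {"user-42": {"billing", "service-improvement"}}
LEGAL_BASES = {"fraud-investigation"}  # uses authorised by law, not consent

def use_permitted(subject: str, purpose: str) -> bool:
    """True if the purpose is legally authorised or was consented to."""
    if purpose in LEGAL_BASES:
        return True
    return purpose in CONSENTED_PURPOSES.get(subject, set())

print(use_permitted("user-42", "billing"))              # True: consented
print(use_permitted("user-42", "ad-targeting"))         # False: no consent
print(use_permitted("user-42", "fraud-investigation"))  # True: legal basis
```

Recording consent per purpose, rather than as a single blanket flag, is what makes this kind of gate enforceable.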

OECD Privacy Principle 5: Security Safeguards:

Personal data should be protected by reasonable security safeguards against risks such as loss or unauthorised access, destruction, use, modification, or disclosure.

OECD Privacy Principle 6: Openness:

There should be a general policy of openness about developments, practices, and policies with respect to personal data. Means should be readily available to establish the existence and nature of personal data, the main purposes for which it is used, and the identity and usual residence of the data controller.

OECD Privacy Principle 7: Individual Participation:

Individuals should be able to: a) obtain from a data controller, or otherwise, confirmation of whether the controller holds data relating to them; b) have data relating to them communicated to them i. within a reasonable time, ii. at a charge, if any, that is not excessive, iii. in a reasonable manner, and iv. in a form that is readily intelligible to them; and c) be given reasons if such a request is denied.

V. BIG DATA INTEROPERABILITY FRAMEWORK:

The NIST Big Data Public Working Group (NBD-PWG) was established on June 19, 2013. The Working Group’s goal is to create a common Big Data framework, known as the “NIST Big Data Interoperability Framework”. The framework consists of seven documents (referred to as “volumes”) that cover core Big Data concerns and were prepared by five NBD-PWG subgroups. The NIST Big Data volumes are still in draft form and are directed largely at US businesses. [6]

The NIST approach must be evaluated against the European data protection framework, which may have consequences for how privacy is managed. The NIST Big Data Security and Privacy Subgroup released the final draft of Volume 6 on Security and Privacy in September 2015; it covers, among other things, security and privacy taxonomies and the security and privacy fabric of the NIST Big Data Reference Architecture (NBDRA). [7]

VI. BIG DATA GOVERNANCE:

Information and communication technology advancements are fast connecting the surrounding environment through massive amounts of data. As a result, advanced data technologies have emerged among scientists and engineers all over the world, especially in the study of big data. As data technology progresses, adhering to business standards and existing industry regulations becomes ever more important. To support impact analysis, a robust governance system must be able to audit data modifications, trace data lineage, and enforce role-based data access. [8]

VII. CONCLUSION:

Big Data Analytics carries weight not only for companies but also for politicians, regulators, and customers, and it holds considerable potential for firms and consumers alike. The new EU Regulation is a significant step forward in enabling business innovation via Big Data Analytics. However, we believe that incorporating privacy and data protection principles into Big Data Analytics systems and projects from the start will enable businesses not only to exploit the capabilities of Big Data Analytics fully, but also to distinguish themselves from competitors who may face complaints and potential sanctions. [9]

As a result, in order to boost company efficiency and innovation, institutions must follow the important privacy and data protection principles, provide GDPR-compliant privacy and data protection, and offer user-friendly Big Data Analytics solutions by properly applying controls, continuously developing and refining those privacy and data protection controls. Businesses must therefore set up a privacy and data protection (risk) management system to identify, mitigate, and manage associated risks in line with their risk appetite. To accomplish this goal, new, innovative, and efficient solutions for securing personal data processed in the context of Big Data Analytics must be considered, and legal and technological tools should be used in unison to build them.

Cite this article as:

Ms. Nandini Tripathy, "Big Data Privacy and Data Protection: A Case Study Analysis Under GDPR", Vol. 3 & Issue 3, Law Audience Journal (e-ISSN: 2581-6705), Pages 302 to 306 (5th February 2022), available at https://www.lawaudience.com/big-data-privacy-and-data-protection-a-case-study-analysis-under-gdpr/.

Footnotes & References:

[1] ISACA: Privacy and Big Data (2013). http://www.isaca.org (last accessed on 27th November 2021).

[2] World Economic Forum: Personal Data: The Emergence of a New Asset Class (2011), www.weforum.org/reports/personal-dataemergence-new-asset-class (last accessed on 27th November 2021).

[3] Katal, A., Wazid, M.: Big data: issues, challenges, tools and good practices. In: IEEE 6th International Conference on Contemporary Computing (IC3). Department of CSE, Graphic Era University, Dehradun, India (2013) (last accessed on 27th November 2021).

[4] Abid, M., Iynkaran, N., Yong, X.: Protection of big data privacy. IEEE Access 4, 1821–1834 (2016). https://doi.org/10.1109/ACCESS.2016.2558446 (last accessed on 27th November 2021).

[5] Pierre-Luc, D.: Privacy and social media in the Age of Big Data. House of Commons, Canada (2012) (last accessed on 27th November 2021).

[6] Hasan, O., Habegger, B., Brunie, L., Bennani, N., Damiani, E.: A discussion of privacy challenges in user profiling with big data techniques: the EEXCESS use case. In: 2013 IEEE International Congress on Big Data, Santa Clara, CA (2013) (last accessed on 27th November 2021).

[7] Gruschka, N., Mavroeidis, V., Vishi, K., Jensen, M.: Privacy issues and data protection in big data: a case study analysis under GDPR. In: 2018 IEEE International Conference on Big Data (Big Data), pp. 5027–5033. IEEE (2018) (last accessed on 27th November 2021).

[8] Ng, W.S., Wu, H., Wu, W., Xiang, S., Tan, K.L.: Privacy preservation in streaming data collection. In: IEEE 18th International Conference on Parallel and Distributed Systems. Institute for Infocomm Research, A*STAR, Singapore (2012) (last accessed on 27th November 2021).

[9] Bachlechner, D., La Fors, K., Sears, A.M.: The role of privacy-preserving technologies in the age of big data. In: WISP 2018 Proceedings, vol. 28 (2018) https://aisel.aisnet.org/wisp2018/28 (last accessed on 27th November 2021).



zkMe

Client Case Study: Data Ownership Protocol (DOP)


Introduction

zkMe is the web3 zero-knowledge (ZK) identity layer. zkMe builds Identity Oracles that leverage the power of zero-knowledge proofs to enable secure, self-sovereign and private credential verifications. In this Customer Case Study series, we’ll explore how leading web3 teams use zkMe to elevate their security and regulatory compliance while simultaneously protecting their users’ identities.

DOP’s Privacy Dilemma: Balancing KYC Compliance with User Anonymity

Data Ownership Protocol (DOP) is the home of selective transparency, giving all of its users full control over the information that’s publicly available about their crypto balances and transaction histories, and it has selected zkMe as its on-chain KYC provider. DOP, known for its innovative approach to selective transparency in blockchain transactions, was facing the daunting task of implementing robust KYC procedures without compromising its users’ privacy, a fundamental pillar of its service offering. The challenge was multifaceted: ensuring regulatory compliance, protecting user data from the breaches common in centralized systems, and maintaining user anonymity in crypto transactions. These seemingly conflicting requirements created a complex problem that demanded an innovative solution.

The zkMe Solution: Decentralized KYC Reimagined

zkMe’s decentralized KYC solution emerged as the perfect answer to DOP’s complex challenges, offering a groundbreaking approach to user verification that aligns seamlessly with DOP’s commitment to selective transparency.

At the heart of zkMe’s solution lies the power of zero-knowledge proofs, a cryptographic method that allows for the verification of user credentials without exposing any personal information. This technology enables DOP to confirm a user’s identity while maintaining the highest levels of privacy, a cornerstone of their service.
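As a toy illustration of the underlying idea (not zkMe's actual construction, which uses far larger parameters and general-purpose ZK circuits), the classic Schnorr identification protocol lets a prover demonstrate knowledge of a secret exponent without ever revealing it:

```python
import secrets

# Toy Schnorr identification protocol: the prover convinces the verifier
# it knows x such that y = g^x mod p, without revealing x. These tiny
# parameters are for illustration only and offer no real security.
p, q, g = 2039, 1019, 4   # p = 2q + 1; g generates the order-q subgroup

x = secrets.randbelow(q - 1) + 1      # prover's secret credential
y = pow(g, x, p)                      # public key the verifier already knows

r = secrets.randbelow(q)              # prover: fresh random nonce
t = pow(g, r, p)                      # prover -> verifier: commitment
c = secrets.randbelow(q)              # verifier -> prover: random challenge
s = (r + c * x) % q                   # prover -> verifier: response

# Verifier checks g^s == t * y^c (mod p) and learns nothing about x,
# since s is masked by the uniformly random nonce r.
assert pow(g, s, p) == (t * pow(y, c, p)) % p
```

The check passes because g^s = g^(r + c·x) = g^r · (g^x)^c = t · y^c (mod p); the transcript (t, c, s) reveals nothing about x beyond the fact that the prover knows it.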

Key features of zkMe’s solution include:

  • Decentralized Processing: The verification process occurs either directly on users’ devices or through a decentralized oracle network. This approach eliminates the need for centralized data storage, significantly reducing the risk of large-scale data breaches that often plague traditional KYC systems.
  • User Data Control: Empowering users with unprecedented control over their personal information, the system allows individuals to revoke access to their data at any time. This feature not only enhances user trust but also reinforces the autonomy that is central to DOP’s ethos.
  • Streamlined Onboarding: The innovative KYC process facilitates a smoother, more secure user onboarding experience.
  • Regulatory Compliance: Despite its privacy-first approach, zkMe’s solution meets stringent global regulatory standards. It complies with requirements set by the Financial Action Task Force (FATF) and various Anti-Money Laundering (AML) bodies, ensuring that DOP can operate within legal frameworks without compromising on user privacy.
  • “Verifiably Anonymous Until Proven Guilty”: This unique approach ensures that KYC documentation remains encrypted and inaccessible unless a government initiates “bad actor” proceedings. This feature provides an additional layer of protection for law-abiding users while still allowing for necessary legal actions when required.

The Transformation: Benefits of Implementing zkMe’s zkKYC Solution

The integration of zkMe’s zkKYC solution has brought about transformative changes for DOP, revolutionizing its approach to user privacy and regulatory compliance. By implementing this innovative KYC process, DOP has successfully reinforced its commitment to selective transparency while meeting all necessary regulatory requirements. Users can now complete their verification without fear of personal information exposure, significantly enhancing trust in the platform. This seamless, privacy-preserving onboarding process has not only improved the user experience but also attracted more privacy-conscious individuals to DOP’s ecosystem, ultimately growing DOP’s user base.

As mentioned in DOP’s partnership announcement  “DOP UNVEILS COLLABORATION WITH ZKME” , “there’s encouraging synergy between both of our projects. Just like we champion selective transparency, zkMe prioritizes selective disclosure by ensuring that precise details pertaining to someone’s identity remain protected at all times.”

Key Resources

  • zkMe Developer Documentation —  https://docs.zk.me/zkme-dochub
  • Book a  Meeting with zkMe

zkMe builds zk Identity Oracles for truly decentralized & anonymous cross-chain credential verifications.

No personal information is ever processed by anyone but the user themselves. Data leaks and misuse by the service provider are impossible; full interoperability and reusability result in a superior ID solution. zkMe is the only FATF-compliant KYC provider to be fully decentralized, offering a full suite of products from anti-bot/anti-Sybil to KYC and more.

For more information, follow the links below:

Website  |  Twitter  |  Discord  |  Docs  |

The Straits Times

Incidents of data leaks in S’pore public sector up 10%, with 201 cases recorded in 2023


SINGAPORE – Incidents of data leaks within the public sector rose by 10 per cent to 201 cases in 2023, likely due to the increase in digital services, said the Ministry of Digital Development and Information (MDDI) on July 30.

Most of the incidents – 172 of them – were of low severity and had minimal impact on agencies, individuals or businesses, said MDDI in its annual report on the Government’s personal data protection efforts. Such cases have been steadily rising since 2021, when 126 low-severity incidents were recorded.

Medium-severity incidents, defined as those that pose minor inconveniences to those affected, fell to 29 cases in 2023, compared with 46 in 2022.

This is partly due to a progressive roll-out of security measures and more awareness of data security within the public sector, said MDDI.

For the fourth year running, there have been no reported incidents classified as high, severe or very severe: incidents that damage national security or the public’s confidence, result in financial or emotional damage, or cause death or serious injury.

Only two incidents classified as severe have been reported to date, both in 2018: the confidential data of 14,200 patients from the Ministry of Health’s HIV registry was leaked, and 223 case files from the State Courts’ online system were accessed without permission.

The annual report, detailing the number of data incidents and measures taken by the authorities to enhance security, is a key initiative of the Public Sector Data Security Review Committee to promote accountability and transparency.

The committee was formed in 2019 after a spate of high-profile cyber-security breaches, including Singapore’s worst data breach involving 1.5 million SingHealth patients’ data in June 2018.

MDDI said public sector organisations have fully implemented all 24 initiatives recommended by the committee between 2020 and 2024. The recommendations include enhancing cyber-security measures, processes to detect and respond to data incidents effectively, and raising awareness in safeguarding data.

Since March, government user accounts that are no longer needed are automatically removed as part of a Central Accounts Management system, built to reduce the risk of unauthorised access by former officers or exploitation by malicious actors.

Public officers have access to a Central Privacy Toolkit (Cloak) which anonymises datasets and helps generate mock data to help with sharing information within agencies or to train artificial intelligence (AI) without risking personal information.

The privacy-enhancing tool has anonymised 20 million documents and supported more than 20 generative AI use cases in the Government, said MDDI.
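The two capabilities described, anonymising identifiers and generating mock data, can be sketched as follows; the field names and the salted-hash scheme are assumptions for illustration and do not describe Cloak's internals:

```python
import hashlib
import random

# Illustrative sketch of two techniques the article attributes to such
# toolkits: pseudonymising direct identifiers and generating mock records.
SALT = "per-project-secret-salt"  # assumed: one secret salt per project

def pseudonymise(record):
    """Replace the direct identifier with a salted one-way hash."""
    out = dict(record)
    out["nric"] = hashlib.sha256((SALT + record["nric"]).encode()).hexdigest()[:12]
    return out

def mock_records(n, seed=0):
    """Generate synthetic rows with the same schema but no real data."""
    rng = random.Random(seed)
    return [{"nric": f"S{rng.randrange(10**7):07d}X",
             "age": rng.randrange(18, 90)} for _ in range(n)]

real = {"nric": "S1234567A", "age": 42}
anon = pseudonymise(real)
assert anon["nric"] != real["nric"] and anon["age"] == 42
```

Pseudonymised rows keep their analytical value (here, the age field) while the identifier can no longer be read back; mock rows can be shared or used for AI training with no link to real individuals at all.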

All public officers are also required to attend data security e-learning modules, which were refreshed in February to incorporate lessons on new technologies, like how to handle data when using government AI large language models.

MDDI will continue to scout for new capabilities to strengthen its data security measures, and review its policies and initiatives to keep them up to date.

Business school teaching case study: can biodiversity bonds save natural habitats?


Andrew Karolyi and John Tobin-de la Puente


In June, the Colombian subsidiary of Spanish banking group BBVA announced that it was issuing what it described as the financial sector’s “first biodiversity bond”, in order to finance habitat conservation and restoration projects in the South American country. 

The $50mn initiative — backed by the International Finance Corporation (IFC), the private sector-focused arm of the World Bank, as structurer and investor — marks a turnaround for a nation recovering from half a century of violence and guerrilla activity. It also places Colombia among a select group of pioneers, including the Seychelles and Belize, that are using the financial markets to support the conservation of nature.

While the green bonds market has seen explosive growth in the past decade, the capital it has raised has overwhelmingly been invested in climate mitigation, alternative energy, and green transportation projects. Minimal amounts go to biodiversity conservation and habitat restoration projects. 

In financing nature, explicitly and directly, this Colombian bond breaks new ground, with metrics linked to objectives that benefit the environment. Investors will be repaid through a mix of funding sources including a carbon tax, the government budget and donors.

Test yourself

This is the sixth in a series of monthly business school-style teaching case studies devoted to responsible-business dilemmas faced by organisations. Read the piece and FT articles suggested at the end (and linked to within the piece) before considering the questions raised. 

About the authors: Andrew Karolyi is professor and dean, John Tobin-de la Puente is professor of practice and co-director of the Initiative on Responsible Finance, both at the Cornell SC Johnson College of Business.

The series forms part of a wide-ranging collection of FT ‘instant teaching case studies’ that explore business challenges.

The question for those concerned about the destruction of the world’s natural habitats is whether this pioneering structured bond will be effective, and whether it could help to inspire a broader range of similar instruments aimed at countering loss of biodiversity around the world. 

Meanwhile, the question for investors is whether the vehicle is sufficiently attractive and robust to attract a new and growing class of funders that may share an interest in environmental issues but also seek competitive returns.

Located at the northern end of the Andes, Colombia straddles the Equator, the Pacific Ocean, the Caribbean, and the Amazon basin. It has the second-highest number of species on the planet after Brazil, and the highest species diversity when measured per square kilometre, according to the World Wildlife Fund . Colombia is home to more than 1,900 species of birds — on a par with Brazil and Peru.

Colombia will be on the frontline of biodiversity losses

But global warming threatens to cause dramatic harm to this biodiversity. Colombia will be on the frontline of these losses because it will be disproportionately affected by climate change compared with countries that have fewer, more widespread species.

Now, though, it could also be in the vanguard of new financial models to reverse the trend.

In 2016, a historic peace agreement between the government and leftist guerrilla group the Revolutionary Armed Forces of Colombia (Farc) marked the end of five decades of armed conflict. Despite continuing violence, the peace process has greatly improved the lives of citizens. However, it has also increased pressure on natural ecosystems. The political violence had meant large areas were shielded from illegal deforestation and degradation of the habitat.

Five years after the peace deal, Colombia became the first Latin American country to issue a green bond in its domestic market : a 10-year $200mn offering aiming to finance a variety of projects intended to benefit the environment — including water management, sustainable transport, biodiversity protection, and renewable energy. High investor demand meant the final amount had been increased by half again.


Finance minister José Manuel Restrepo described the structured bond as an “important step” in finding new ways to finance investment in environmental projects: it would help develop a domestic green bond market and attract a wider range of investors. His ministry identified another $500mn in eligible projects that could be financed through green bonds, including a $50mn Colombian “blue bond” — financing focused on marine habitats and ocean-based projects that generate environmental co-benefits. This was successfully placed in 2023 with the help of BBVA and the IFC as structurer.

Now, the announcement of BBVA Colombia’s biodiversity bond marks another step forward. It focuses on reforestation, regeneration of natural forests on degraded land, mangrove conservation, and wildlife habitat protection.

In the case of green bonds, only a minuscule share of the money raised is spent on nature conservation, in part because few such projects generate cash flows from which to repay investors. Another reason is that it is harder to measure how effectively resources dedicated to conservation are deployed, such as for monitoring species population growth, or to track activities that help reach conservation targets over time, such as restoring degraded ecosystems.

Using private, financial return-seeking capital to finance the sustainable management and conservation of natural resources is viewed by many experts as the most realistic solution to the twin crises of biodiversity loss and climate change — given the magnitude of investment needed. 

Yet there is growing political pushback against environmental and social initiatives, most notably in the US. 

Regulators and consumer groups have also launched legal actions to challenge green objectives. Large corporations, including Unilever, Bank of America and Shell, have in the past year dropped or missed goals to cut carbon emissions. And there has been disillusion with the ability of sustainability-linked bonds to meet their objectives. 

By association, that raises fresh questions about continued progress on biodiversity.

In biodiversity finance, doing deals is inherently more difficult

In tackling the climate crisis, the trajectory seems clear: the set of solutions needed is more or less agreed, and a good part of it makes economic sense. But, in biodiversity finance, doing deals is inherently more difficult.

It is more complex to structure transactions that generate proceeds to protect wildlife, restore ecosystems and fund other activities that may not generate cash flows, all while ensuring investors are repaid. Early successes, such as Belize’s blue bond, are encouraging, but the potential for real scale is still unclear.

Questions for discussion

How companies are starting to back away from green targets (ft.com)

Green bond issuance surges as investors hunt for yield (ft.com)

Sustainability-linked bonds falter amid credibility concerns (ft.com)

Consider these questions:

1. How critical is the role of the IFC as structurer of the BBVA Colombia biodiversity bond deal in validating its legitimacy and providing investors with assurance? How important is it that IFC is also a co-investor in the biodiversity bond issuance?  

2. What are the pros and cons of the fact that the $50mn BBVA Colombia biodiversity bond deal has been launched following Colombia’s successful placement three years earlier of its sovereign green bond, and following its newly announced “green taxonomy”?  

3. What does the Colombian experience say about the likelihood of rapid change in how countries manage their biodiversity and climate impacts? Does Colombia demonstrate that such change is possible, or is its experience unique and unlikely to represent a model of rapid action for other countries?

4. Can biodiversity bonds meaningfully help to address biodiversity loss? And is this transaction the start of a trend? If not, why would BBVA Colombia have executed this transaction? Is it a gesture of goodwill and a recognition of its own corporate responsibility, or a means to greenwash some of its other less appealing investments?

5. Considering the economic and social context following the peace agreement between Colombia and the Farc forces, how might the shift from conflict to peace affect the country’s ability to balance economic development with environmental conservation?   



Betts Farm Case Study – Cover Cropping in Concord Grape Vineyards


It started as an experiment… 

In 2011, Betts planted several middle-row alleys with winter tillage radish in an effort to alleviate soil compaction. Radishes are brassicas that form a thick taproot, like a carrot, and are known to break up soil and scavenge excess nitrate. When the radish dies, the large taproot decays to create soil pores that encourage water infiltration and gas exchange. Betts strategically planted in rows with new tile drainage to see if soil pores would channel water to the tile lines below.  The water infiltration improvement was evident, along with an unexpected benefit in the form of high biomass production, or the amount of living material generated from planting the cover crop (Figure 1). The Betts noticed more earthworm activity, soil stability, and a decrease in weed pest pressure in the middle rows where they planted the winter tillage radish. 


Figure 1: Bob Betts proudly shows a well aggregated clod from a cover cropped area (left) and a compacted clod from a non-cover cropped area (right).

The success prompted Betts to expand his cover crop repertoire in 2012 by seeding alternating bands of annual ryegrass and radish seven inches apart, for a total of nine bands per middle row. The ryegrass was intended to complement the large holes left by decayed radishes, which allowed for the water infiltration they were hoping for but also created unstable ground for tractor access. Adding ryegrass, a species with an extensive, soil-holding root system, helps stabilize the ground during wet periods and allows easier tractor access. The experiment wasn’t entirely successful, as the radishes crowded out the ryegrass, a common occurrence if the radish seeding rate is too high (one extra pound of radish seed per acre can make a huge difference) or there is high residual nitrogen in the soil.

Fortunately, financial help arrived in the form of the Environmental Quality Incentives Program (EQIP), a Natural Resources Conservation Service (NRCS) subsidy funding cover crop efforts to combat erosion and improve soil health. Betts took advantage of its prescribed mixes of three to seven seed species, and worked with the Lake Erie Regional Grape Program (LERGP) to alternate mixed plantings with fallow plots three panels long by three rows wide (24 x 9 ft) to serve as an experimental control. This experiment block has been ongoing for eleven years.

The Betts Farms cover crop program has helped address many practical concerns, including erosion, summer moisture retention, and weed suppression. Betts has further innovated by initiating use of a 5-foot-wide I & J roller crimper in 2015 to terminate cover crops in June, an uncommon practice in Concord grape vineyards. Rolling the aboveground portion of the cover crop protects the soil from rain droplet impact, while cover crop roots hold soil in place during periods of intense rainfall (Figure 2). This decreases the runoff and erosion that may carry pesticides, valuable nutrients, and topsoil away from the grapevines. The roller treatment also addresses the concern that cover crops might compete with vines for soil moisture during times of drought, as the biomass mat created in early June shades the ground, retaining soil moisture. The mat also results in cooler surface temperatures, creating better soil microbe habitat than hot, dry, bare soil. 

Both the actively growing cover crop and the biomass mat help suppress weeds, which is especially important for problematic annual species like Marestail (Conyza canadensis), a plant commonly resistant to glyphosate (Roundup). While Marestail was a significant problem in Betts Farms’ control rows, it was rare in cover-cropped rows that had been rolled and crimped (Figure 2), which reduced his overall reliance on glyphosate. The biomass mat provides adequate weed prevention most years, and when it does not, herbicide can be applied as needed.


Figure 2: A mat of rolled cover crop biomass protects soil during an intense rainfall event that delivered 5 inches in 2 hours on July 14th, 2015 (left); cover crops reduce weed growth, as evident here: Marestail (Conyza canadensis) grows in the control area but not in the cover-cropped portion behind it (right).

Soil Health Benefits

In addition to the ‘above ground’ benefits, visual inspection makes it obvious that life below ground has improved, too. Earthworms, nature’s plows, are increasingly prevalent. As earthworms eat, soil and decomposing organic matter are mixed together in their gut, then deposited as ‘casts’—stable assemblages of organic and mineral particles atop their burrows. These casts are more fertile than the surrounding soil and help increase nutrient availability for the shallow-rooted grapevines. Betts has also noticed increased lateral vine root growth in areas of increased earthworm activity.

To confirm the empirical observation of improved soil health, Betts worked with Cornell’s New York Soil Health Initiative in May 2021 to collect four composite soil samples from the cover crop and non-cover crop control treatments for a standard soil health assessment at the Cornell Soil Health Lab. Six 0–6” soil slices were taken as composite samples from two locations within the experimental area, classified as a Barcelona silt loam comprising approximately 13% sand, 60% silt, and 27% clay.

The soil samples from the cover-cropped plots had consistently higher soil respiration (27%) and aggregate stability (58%) compared to the non-cover cropped plots (Table 1, Figure 2). Higher soil respiration indicates that cover crop biomass inputs are fueling soil microbes’ conversion of organic residues into mineral-accessible nutrients, such as nitrate and ammonium, faster than in the control plots. Higher aggregate stability measurements confirmed that the soil under cover crops was much better aggregated than the more compacted, non-cover cropped soil (Table 1, Figure 3). Living roots, their associated arbuscular mycorrhizal fungi (AMF), and increased organic matter all help build and maintain stable aggregates, which in turn support greater water infiltration and reduced topsoil erosion. This is evident in a comparison of respiration and aggregate stability values and soil health scores for the Betts Farms treatments against pastures and perennial fruit (orchards and vineyards) on silt loam soils in New York (Figure 3). No significant differences were observed in soil organic matter and active carbon, which may be due to high initial levels of soil organic matter and inherent site variability. There is an indication that cover crops make phosphorus (P) and potassium (K) more available, which could help increase vine productivity.


Table 1: Cover crop (CC) vs. non-cover crop control (NCC) treatment effect for the Betts Farms vineyard in 2021. These values reflect the mean of two composite soil samples per treatment. The abbreviations in the table are as follows: Treatment (Trt), Soil Organic Matter (SOM), Respiration (Resp), Aggregate Stability (Agg Stab), Phosphorus (P), Potassium (K), Magnesium (Mg), Iron (Fe), Soil Health score (SH score), Cover Crop Treatment (CC), Non-Cover Crop Treatment (NCC).


Figure 3: Soil health benchmarking of Betts Farms soil respiration (a) and aggregate stability (b) compared to other pastures and perennial fruit systems on silt loam soils in NYS. 

Vine Productivity Benefits

The ultimate test of any management system is its effect on productivity, and farmers dream of finding a win-win solution that both improves soil health and increases crop yield. Through pruning weight measurements taken between 2019 and 2021, vines in cover-cropped plots were shown to have consistently higher pruning weights than those in control plots (Table 2). Pruning weights measure the annual growth removed from dormant vines as an indicator of vine size and potential crop yield, so higher pruning weights suggest that cover crops have improved soil health and nutrient availability, in turn supporting better vine growth. Conversely, a loss in vine size would have indicated that cover crops competed with vines for water and nutrients. To verify this effect, crop yield data will be analyzed over the coming years alongside pruning weight trends.


Table 2: Pruning weights for cover-cropped and non-cover-cropped areas, 2019 to 2021.
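A year-by-year comparison like Table 2 can be summarized as a per-year percent difference between treatments. The sketch below uses hypothetical pruning weights (lbs per vine), not the published data.

```python
# Summarizing a pruning-weight comparison like Table 2.
# All weights are hypothetical (lbs per vine), not the published Betts Farms data.

pruning_weights = {
    # year: (cover crop mean, non-cover crop mean)
    2019: (2.1, 1.8),
    2020: (2.4, 1.9),
    2021: (2.6, 2.0),
}

for year, (cc, ncc) in sorted(pruning_weights.items()):
    diff_pct = (cc - ncc) / ncc * 100
    print(f"{year}: CC {cc:.1f} vs NCC {ncc:.1f} lbs/vine (+{diff_pct:.0f}%)")
```

A consistent positive difference across all three years, rather than any single-year gap, is what supports the "consistently higher" claim in the text.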

Bob Betts began his farm’s cover cropping trial to reduce soil compaction, but found that it also improved soil health and vine productivity. This case study was instrumental in securing additional funding for Betts Farms, working with the Cornell Lake Erie Research and Extension Laboratory, NRCS, the New York Soil Health Initiative at Cornell, and the New York Farm Viability Institute, to pursue further research aimed at improving vineyard soil health and achieving farm goals. Stay tuned for more exciting vineyard cover crop innovations!

Jennifer Phillips Russo is an extension associate and viticulture specialist for Cornell Cooperative Extension. She is part of the Cornell Lake Erie Research and Extension Laboratory (CLEREL) and serves as team leader for the Lake Erie Regional Grape Program. Bob Betts is the fourth-generation farmer and owner of Betts Farm in Westfield, NY. He is passionate about multi-species cover cropping research. Joseph Amsili is an extension associate and program coordinator with the Cornell Soil Health Program and New York Soil Health Initiative.

