Research on the Ownership of Cell Therapy Products

1. Issues concerning ownership of cell therapy products

  Regarding the issue of ownership interests, the American Medical Association (AMA) pointed out in 2016 that using human tissues to develop commercially available products raises questions about who holds property rights in human biological materials[1]. In the United States, there have been several disputes over whether the donor of cells used in a cell therapy can claim ownership of the resulting product, including Moore v. Regents of University of California (1990)[2], Greenberg v. Miami Children's Hospital Research Institute (2003)[3], and Washington University v. Catalona (2007)[4]. The courts tend to hold that because the cells and tissues were donated voluntarily, the donors relinquished their property rights in them at the time of donation. In Moore, even though the researchers used Moore’s cells for commercial benefit without his consent, the court still held that the donor could not claim property rights in the removed cells, reasoning that this spares researchers the burden of clarifying whether each use of the cells violates the donors’ wishes and thereby reduces the legal risk of R&D activities. The United Kingdom’s Medical Research Council (MRC) likewise noted in 2019 that donated human material is usually described as a ‘gift’, and that donors of samples are not usually regarded as having ownership or property rights in it[5]. Accordingly, both the US and the UK tend to take the position that it is not appropriate for cell donors to claim ownership.

2. The ownership of cell therapy products through the lens of Taiwan’s Civil Code

  In Taiwan, Article 766 of the Civil Code stipulates: “Unless otherwise provided by the Act, the component parts of a thing and the natural profits thereof, belong, even after their separation from the thing, to the owner of the thing.” Accordingly, many scholars believe that ownership of parts separated from the human body belongs to the person from whom they were separated; ownership of cells obtained from a donor should therefore be regarded as still belonging to the donor. In addition, since Article 406 of the Civil Code stipulates that “A gift is a contract whereby the parties agree that one of the parties delivers his property gratuitously to another party and the latter agrees to accept it,” if the act of donation can be regarded as a gift, then ownership of the cells passes from the donor to the party who accepts them.

  However, among the different versions of the draft Regenerative Medicine Biologics Regulation proposed by Taiwanese legislators, some replace the term “donor” with “provider”. Whether cell providers, as opposed to cell donors, can claim ownership of the cell therapy product after providing their cells therefore still requires further discussion.

  Article 69 of the Civil Code stipulates that “Natural profits are products of the earth, animals, and other products which are produced from another thing without diminution of its substance.” In addition, Article 766 of the Civil Code stipulates that “Unless otherwise provided by the Act, the component parts of a thing and the natural profits thereof, belong, even after their separation from the thing, to the owner of the thing.” Thus, many scholars believe that where a product is organic, both the original substance and its natural profits belong to the owner of the original substance. For example, when proteins are produced from isolated cells, the proteins can be deemed natural profits, and ownership of both the proteins and the isolated cells belongs to the owner of the cells[6].

  Nevertheless, Article 814 of the Civil Code stipulates that “When a person has contributed work to a personal property belonging to another, the ownership of the personal property upon which the work is done belongs to the owner of the material thereof. However, if the value of the contributing work obviously exceeds the value of the material, the ownership of the personal property upon which the work is done belongs to the contributing person.” On this basis, some scholars argue that regenerative medicine technology, which induces cell differentiation, involves highly complex biotechnology and should be deemed contributing work, so that ownership of the cell product after such work belongs to the contributing person[7]. Accordingly, once a provider supplies cells to a researcher and complex biotechnological work has been applied, the provider’s original ownership of the cells should be deemed extinguished, leaving no basis for the provider to claim ownership.

  However, since the development of cell therapy products involves a series of R&D activities, it still needs to be clarified who is entitled to ownership of the final cell therapy product. Under Taiwan’s Civil Code, ownership of a product after contributing work belongs to the contributing person; when there are numerous contributing persons, however, which of them ownership belongs to may have to be determined on a case-by-case basis.

3. Conclusion

  The biggest difference between cell therapy products and small molecule drugs or other biologics is that the original cell materials are provided by donors or providers and the whole development process involves numerous contributing persons. Hence, ownership disputes are prone to arise.

  In addition to the disputes discussed above, the United Kingdom Co-ordinating Committee on Cancer Research (UKCCCR) has noted that there is a long list of people and organizations who might lay claim to ownership of specimens and their derivatives, including the donor and relatives, the surgeon and pathologist, the hospital authority where the sample was taken, the scientists engaged in the research, the institution where the research was carried out, the funding organization supporting the research, and any collaborating commercial company. Thus, ultimate control of subsequent ownership and patent rights will need to be negotiated[8].

  Since the same issues may also arise in Taiwan, carefully clarifying ownership among stakeholders when developing cell therapy products is necessary to avoid possible disputes.


[1]American Medical Association [AMA], Commercial Use of Human Biological Materials, Code of Medical Ethics Opinion 7.3.9, Nov. 14, 2016, https://www.ama-assn.org/delivering-care/ethics/commercial-use-human-biological-materials  (last visited Jan. 3, 2021).

[2]Moore v. Regents of University of California, 793 P.2d 479 (Cal. 1990).

[3]Greenberg v. Miami Children's Hospital Research Institute, 264 F. Supp. 2d 1064 (S.D. Fla. 2003).

[4]Washington University v. Catalona, 490 F.3d 667 (8th Cir. 2007).

[5]Medical Research Council [MRC], Human Tissue and Biological Samples for Use in Research: Operational and Ethical Guidelines, 2019, https://mrc.ukri.org/publications/browse/human-tissue-and-biological-samples-for-use-in-research/ (last visited Jan. 3, 2021).

[6]Wen-Hui Chiu, The legal entitlement of human body, tissue and derivatives in civil law, Angle Publishing, 2016, at 327.

[7]Id. at 341.

[8]United Kingdom Co-ordinating Committee on Cancer Research [UKCCCR], UKCCCR Guidelines for the Use of Cell Lines in Cancer Research, https://europepmc.org/backend/ptpmcrender.fcgi?accid=PMC2363383&blobtype=pdf (last visited Jan. 3, 2021).
