The Institutionalization of the Taiwan Personal Data Protection Commission - A Triumph of Digital Constitutionalism: A Legal Positivist Analysis
2023/07/13
The Legislative Yuan recently passed an amendment to the Taiwan Personal Data Protection Act, which resulted in the institutionalization of the Taiwan Personal Data Protection Commission (hereinafter the “PDPC”)[1]. This article analyzes the significance of this institutionalization from three perspectives: legal positivism, digital constitutionalism, and Millian liberalism. By examining these frameworks, we can better understand the constitutional essence of sovereignty, the power dynamics among individuals, businesses, and governments, and the paradox of freedom that the PDPC addresses through governance and trust.
I. Three Layers of Significance
1. Legal Positivism
The institutionalization of the PDPC fully demonstrates the constitutional essence of sovereignty resting with the citizens. Legal positivism emphasizes the importance of recognizing and obeying laws enacted by legitimate authorities (the sovereign being, as Austin claims, one who is habitually obeyed by all yet does not itself obey anyone else)[2]. In this context, the institutionalization of the PDPC signifies the recognition of citizens' right to control their personal data and the acknowledgment of the state's role in protecting their privacy. It underscores the idea that the power to govern personal data rests with the individuals themselves, reinforcing legal positivism's account of the sovereign.
Moreover, legal positivism recognizes the authority of the state in creating and enforcing laws. The institutionalization of the PDPC as a specialized commission with the power to regulate and enforce personal data protection laws represents the state's recognition of the need to address the challenges posed by the digital age. By investing the PDPC with the authority to oversee the proper handling and use of personal data, the state acknowledges its responsibility to protect the rights and interests of its citizens.
2. Digital Constitutionalism
The institutionalization of the PDPC also rebalances the power structure among individuals, businesses, and governments in the digital realm[3]. Digital constitutionalism refers to the principles and norms that govern the relationship between individuals and the digital sphere, ensuring the protection of rights and liberties[4]. With the rise of technology and the increasing collection and use of personal data, individuals often find themselves at a disadvantage compared to powerful entities such as corporations and governments[5].
However, the PDPC acts as a regulatory body that safeguards individuals' interests, rectifying the power imbalances and promoting digital constitutionalism. By establishing clear rules and regulations regarding the collection, use, and transfer of personal data, the PDPC may set a framework that ensures the protection of individuals' privacy and data rights. It may enforce accountability among businesses and governments, holding them responsible for their data practices and creating a level playing field where individuals have a say in how their personal data is handled.
3. Millian Liberalism
The need for the institutionalization of the PDPC embodies the paradox of freedom, as raised in John Stuart Mill’s “On Liberty”[6], where Mill recognizes that absolute freedom can lead to the infringement of others' rights and well-being. In this context, the institutionalization of the PDPC acknowledges the necessity of governance to mitigate the risks associated with personal data protection.
In the digital age, the vast amount of personal data collected and processed by various entities raises concerns about privacy, security, and potential misuse. The institutionalization of the PDPC represents a commitment to address these concerns through responsible governance. By setting up rules, regulations, and enforcement mechanisms, the PDPC ensures that individuals' freedoms are preserved without compromising the rights and privacy of others. It strikes a delicate balance between individual autonomy and the broader social interest, shedding light on the paradox of freedom.
II. Legal Positivism: Function and Authority of the PDPC
1. John Austin's Concept of Legal Positivism: Sovereignty, Punishment, Order
To understand the function and authority of the PDPC, we turn to John Austin's concept of legal positivism. Austin posited that laws are commands issued by a sovereign authority and backed by sanctions[7]. Sovereignty entails the power to make and enforce laws within a given jurisdiction.
In the case of the PDPC, its institutionalization by the Legislative Yuan reflects the recognition of its authority to create and enforce regulations concerning personal data protection. The PDPC, as an independent and specialized commission, possesses the necessary jurisdiction and competence to ensure compliance with the law, impose sanctions for violations, and maintain order in the realm of personal data protection.
2. Dire Need for the Institutionalization of the PDPC
There has been a dire need for the establishment of the PDPC following the Constitutional Court's decision in August 2022, holding that the government needed to establish a specific agency in charge of personal data-related issues[8]. This need reflects John Austin's concept of legal positivism, as it highlights the demand for a legitimate and authoritative body to regulate and oversee personal data protection. The PDPC's institutionalization serves as a response to the growing concerns surrounding data privacy, security breaches, and the increasing reliance on digital platforms. It signifies the de facto recognition of the need for a dedicated institution to safeguard the individual’s personal data rights, reinforcing the principles of legal positivism.
Furthermore, the institutionalization of the PDPC demonstrates the responsiveness of the legislative branch to the evolving challenges posed by the digital age. The amendment to the Taiwan Personal Data Protection Act and the subsequent institutionalization of the PDPC are the outcomes of a democratic process, reflecting the will of the people and their desire for enhanced data protection measures. It signifies a commitment to uphold the rule of law and ensure the protection of citizens' rights in the face of emerging technologies and their impact on privacy.
3. Authority to Define Cross-Border Transfer of Personal Data
Upon the establishment of the PDPC, its authority to define what constitutes a cross-border transfer of personal data under Article 21 of the Personal Data Protection Act will align with John Austin's theory of order. According to Austin, laws bring about order by regulating behavior and ensuring predictability in society.
By granting the PDPC the power to determine cross-border data transfers, the legal framework brings clarity and consistency to the process. This promotes order by establishing clear guidelines and standards, reducing uncertainty, and enhancing the protection of personal data in the context of international data transfers.
The PDPC's authority in this regard reflects the recognition of the need to regulate and monitor the cross-border transfer of personal data to protect individuals' privacy and prevent unauthorized use or abuse of their information. It ensures that the transfer of personal data across borders adheres to legal and ethical standards, contributing to the institutionalization of a comprehensive framework for cross-border data transfer.
III. Conclusion
In conclusion, the institutionalization of the Taiwan Personal Data Protection Commission represents the convergence of legal positivism, digital constitutionalism, and Millian liberalism. It signifies the recognition of citizens' sovereignty over their personal data, rebalances power dynamics in the digital realm, and addresses the paradox of freedom through responsible governance. Analyzing the PDPC's function and authority through the lens of legal positivism shows its role as a regulatory body that maintains order and upholds the rule of law. The institutionalization of the PDPC is thus a milestone in Taiwan's commitment to protecting individuals' personal data and safeguarding digital rights: a triumph of digital constitutionalism in which individuals' rights and interests are protected, power imbalances are rectified, and the paradox of freedom is met with responsible governance in the digital age.
Reference:
[1] Lin Ching-yin & Evelyn Yang, Bill to establish data protection agency clears legislative floor, CNA English News, FOCUS TAIWAN, May 16, 2023, https://focustaiwan.tw/society/202305160014 (last visited July 13, 2023).
[2] Legal positivism, Stanford Encyclopedia of Philosophy, https://plato.stanford.edu/entries/legal-positivism/?utm_source=fbia (last visited July 13, 2023).
[3] Edoardo Celeste, Digital constitutionalism: how fundamental rights are turning digital 13-36 (2023), https://doras.dcu.ie/28151/1/2023_Celeste_DIGITAL%20CONSTITUTIONALISM_%20HOW%20FUNDAMENTAL%20RIGHTS%20ARE%20TURNING%20DIGITAL.pdf (last visited July 3, 2023).
[4] GIOVANNI DE GREGORIO, DIGITAL CONSTITUTIONALISM IN EUROPE: REFRAMING RIGHTS AND POWERS IN THE ALGORITHMIC SOCIETY 218 (2022).
[5] Edoardo Celeste, Digital constitutionalism: how fundamental rights are turning digital (2023), https://doras.dcu.ie/28151/1/2023_Celeste_DIGITAL%20CONSTITUTIONALISM_%20HOW%20FUNDAMENTAL%20RIGHTS%20ARE%20TURNING%20DIGITAL.pdf (last visited July 13, 2023).
[6] JOHN STUART MILL, On Liberty (1859), https://openlibrary-repo.ecampusontario.ca/jspui/bitstream/123456789/1310/1/On-Liberty-1645644599.pdf (last visited July 13, 2023).
[7] Legal positivism, Stanford Encyclopedia of Philosophy, https://plato.stanford.edu/entries/legal-positivism/?utm_source=fbia (last visited July 13, 2023).
[8] Lin Ching-yin & Evelyn Yang, Bill to establish data protection agency clears legislative floor, CNA English News, FOCUS TAIWAN, May 16, 2023, https://focustaiwan.tw/society/202305160014 (last visited July 13, 2023).
Blockchain and General Data Protection Regulation (GDPR) compliance issues (2019)
I. Brief
Blockchain technology can solve the problem of trust between data demanders and data providers. In a centralized mode, data demanders can only choose to believe that the centralized platform does not contain false information. In the decentralized mode, by contrast, the data is not controlled by any single group or organization[1], so data demanders can directly verify information such as the data source, time, and authorization on the blockchain without worrying about the correctness and authenticity of the data. Take immutability for example: it conflicts with the right to erasure (also known as the right to be forgotten) in the GDPR. With encryption and one-time pad (OTP) techniques, however, data subjects can have their data stored off-chain or modified at any time on a decentralized platform, so the concern that data on a blockchain cannot meet the GDPR requirements has gradually faded.
II. What is the GDPR?
The purpose of the EU GDPR is to protect users' data and to prevent large-scale online platforms or large enterprises from collecting or using users' data without their permission. Violators can be fined by the EU up to 20 million Euros (roughly 700 million NT dollars) or 4% of the worldwide annual revenue of the prior financial year. The aim is to promote the free movement of personal data within the European Union while maintaining an adequate level of data protection. The GDPR is a technology-neutral law: it applies to any type of technology used to process personal data. The question has therefore arisen whether data on a blockchain complies with the GDPR. Since blockchain is decentralized, one of its original design goals is to avoid the abuse of large amounts of centralized data. Blockchains can be divided into permissioned and permissionless blockchains. The former are also called “private chains” or “enterprise chains”, meaning that no one can join the blockchain without consent; the latter are also called “public chains”, meaning that anyone can participate on the chain without obtaining consent. A private chain, however, is sometimes not completely decentralized. Demand for blockchain applications has also produced a hybrid of the two types, called the “consortium chain” (or “alliance chain”), which maintains the privacy of a private chain while retaining some characteristics of public chains. Information on a consortium chain is open and transparent, which creates tension with the application of the GDPR.
III. How does the GDPR apply to blockchain?
First, it should be determined whether the data on the blockchain is personal data protected by the GDPR. Second, what is the relationship, and what are the respective responsibilities, of the data subject, the data controller, and the data processor? Finally, we discuss the common technical characteristics of blockchain and how the GDPR applies to them.
1. Is data on the blockchain personal data protected by the GDPR?
Starting from its technical characteristics, blockchain technology is commonly described as decentralized, anonymous, immutable, trackable, and encrypted; its major characteristics also include authenticity, transparency, uniqueness, and collective consensus. Further, a blockchain is an open, decentralized ledger technology that can verify and permanently store transactions between two parties in a provable way.
It is a distributed database: all users on the chain can access the database and its history, and can directly verify transaction records. Each node uses peer-to-peer transmission to upload or transfer information without third-party intermediation, which is the distinctive “decentralization” feature of the blockchain. In addition, each node or user on the chain has a unique, identifiable address of more than 30 alphanumeric characters, but the user may choose to remain anonymous or to provide identification, a feature of transparency combined with pseudonymity[2]. Records on the blockchain are also irreversible: once a transaction is recorded and updated on the chain, it is difficult to change and is permanently stored in the database; in other words, the blockchain is “tamper-resistant”[3].
According to Article 4(1) of the GDPR, “personal data” means any information relating to an identified or identifiable natural person (‘data subject’); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person. Therefore, if the data subject cannot be identified from the personal data on the blockchain, the data is anonymous and falls outside the scope of the GDPR.
(1) What is anonymization?
According to Opinion 05/2014 on Anonymisation Techniques by the Article 29 Data Protection Working Party of the European Union, “anonymization” is a technique applied to personal data in order to achieve irreversible de-identification[4]. The opinion also treats the hash functions used in blockchains as a pseudonymization technique, because the personal data can still be re-identified. Hashing is therefore not “anonymization”, and data on the blockchain may still be personal data governed by the GDPR. As blockchain evolves, it may become possible to develop technologies that fall outside the GDPR, such as certain encryption processes capable of satisfying the anonymization requirements of the courts or the European data protection authorities. The industry has also developed technical compliance solutions, such as avoiding storing transaction data directly on the chain.
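To see why a hash is treated as pseudonymization rather than anonymization, consider the short sketch below (illustrative Python, not taken from the cited opinion): hashing an identifier is deterministic, so anyone who can enumerate plausible identifiers can re-link the stored value to a person.

```python
import hashlib

def pseudonymize(national_id: str) -> str:
    # The hash output looks random, but the same input always yields the same output.
    return hashlib.sha256(national_id.encode()).hexdigest()

on_chain_value = pseudonymize("A123456789")  # hypothetical value written to the ledger

# A party holding a list of candidate identifiers can test each one and
# re-identify the record, which is why Opinion 05/2014 classifies hashing
# as pseudonymization rather than irreversible de-identification.
for candidate in ["A123456780", "A123456789", "B987654321"]:
    if pseudonymize(candidate) == on_chain_value:
        print("re-identified:", candidate)
```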
2. International data transfers
Furthermore, in accordance with Article 3 of the GDPR, “This Regulation applies to the processing of personal data in the context of the activities of an establishment of a controller or a processor in the Union, regardless of whether the processing takes place in the Union or not. This Regulation applies to the processing of personal data of data subjects who are in the Union by a controller or processor not established in the Union, where the processing activities are related to: (a) the offering of goods or services, irrespective of whether a payment of the data subject is required, to such data subjects in the Union; or (b) the monitoring of their behaviour as far as their behaviour takes place within the Union”.[5] In other words, the GDPR applies only when the data on the blockchain is not anonymized and the processing involves personal data of data subjects in the EU.
3. Identification of data controllers and data processors
Where the technology involves the public storage of EU data subjects' personal data and its transfer to a third party, that party may be identified as a “data controller” under Article 4 of the GDPR, and all nodes and miners of the platform may be deemed “joint controllers” of the data, bearing joint responsibility with the data controller under the GDPR. Data subjects may, for example, assert the right to erasure against the data controller. A blockchain operator may instead be identified as a “processor”; for example, with Backend as a Service (BaaS) products, third parties provide the network infrastructure and let users manage and store personal data. Such cloud service companies provide online services on behalf of customers and do not act as “data controllers”. Some commentators believe that private chains or consortium chains, used for applications such as land-record transfers or inter-bank sharing of customer information, are not completely decentralized and are therefore more likely to meet GDPR requirements than public-chain applications such as cryptocurrencies (Bitcoin, for example)[6]. A private chain or consortium chain is a closed platform containing only a small number of trusted nodes, which makes complying with the GDPR rules more practicable.
4. Data subject claims
In accordance with Article 17 of the GDPR, the data subject has the right to obtain from the controller the erasure of personal data concerning him or her without undue delay, and the controller is obliged to erase the personal data without undue delay on certain grounds. Off-chain storage can help the blockchain industry comply with this rule by keeping personal data off the chain; alternatively, trusted nodes can delete the private key used to encrypt the information, leaving data on the chain that can no longer be read or identified. If the result meets the GDPR's definition of anonymization, the GDPR no longer applies.
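One commonly discussed way to reconcile an append-only ledger with Article 17 is “crypto-shredding”: the personal data is encrypted before it is recorded, the key is kept off-chain, and erasure is effected by destroying the key so that only unreadable ciphertext remains. The sketch below is a simplified illustration of that idea using Python's cryptography library; it is not a description of how any particular platform implements erasure, and whether the result counts as anonymization under the GDPR remains the legal question discussed above.

```python
from cryptography.fernet import Fernet

# Encrypt the personal data before anything is written to the ledger;
# only the ciphertext is recorded on-chain.
key = Fernet.generate_key()          # held off-chain by a trusted node
ciphertext = Fernet(key).encrypt(b"name=Alice; email=alice@example.com")
ledger = [ciphertext]                # append-only record

# "Erasure": instead of rewriting the chain, the trusted node destroys the key.
key = None

# The ciphertext is still on the ledger, but without the key it can no longer
# be read or linked back to the data subject.
```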
IV. Conclusion
In summary, applying the GDPR to blockchain raises at least two issues: (a) after data subjects upload their data, it is difficult to identify the data controllers and data processors; and (b) decentralized storage is by nature transnational, so it must be asked whether the countries in which the nodes are located are covered by an “adequacy decision” under Article 45 of the GDPR; if not, it must be considered whether the transfer can rely on the appropriate safeguards of Article 46 or on the derogations for specific situations of Article 49 of the GDPR.
Reference:
[1] How to Trade Cryptocurrency: A Guide for (Future) Millionaires, https://wikijob.com/trading/cryptocurrency/how-to-trade-cryptocurrency
[2] DONNA K. HAMMAKER, HEALTH RECORDS AND THE LAW 392 (5TH ED. 2018).
[3] Marco Iansiti & Karim R. Lakhani, The Truth about Blockchain, Harvard Business Review 95, no. 1 (January-February 2017): 118-125, https://hbr.org/2017/01/the-truth-about-blockchain
[4] Article 29 Data Protection Working Party, Opinion 05/2014 on Anonymisation Techniques (2014), https://www.pdpjournals.com/docs/88197.pdf
[5] Regulation (EU) 2016/679 (General Data Protection Regulation), https://eur-lex.europa.eu/legal-content/EN/TXT/HTML/?uri=CELEX:32016R0679&from=EN
[6] Queen Mary University of London, Are blockchains compatible with data privacy law?, https://www.qmul.ac.uk/media/news/2018/hss/are-blockchains-compatible-with-data-privacy-law.html
The use of automated facial recognition technology and supervision mechanism in the UK
I. Introduction
Automatic facial recognition (AFR) technology has developed rapidly in recent years and can identify target people in a short time. The UK Home Office announced the "Biometrics Strategy" on June 28, 2018, stating that AFR technology will be introduced in law enforcement and that the Home Office will actively cooperate with other agencies to establish a new oversight and advisory board in order to maintain public trust. AFR technology can improve law enforcement work, but its use increases the risk of intruding on individual liberty and privacy. This article focuses on the application of AFR technology proposed by the UK Home Office. The first part describes the use of AFR technology by the police. The second part focuses on the supervision mechanism proposed by the Home Office in the Biometrics Strategy. Because the use of AFR technology is still controversial, this article then sorts out the key issues for follow-up development through the opinions of the public and private sectors. An overview of the discussion of AFR technology used by police agencies should be helpful for further policy formulation.
II. Overview of the strategy of AFR technology used by the UK police
According to the Home Office's Biometrics Strategy, AFR technology will be used in law enforcement, passports and immigration, and national security to protect the public and make these public services more efficient[1]. Since 2017 the UK police have worked with tech companies to test AFR technology at public events such as the Notting Hill Carnival and big football matches[2]. In practice, AFR technology is deployed with mobile or fixed camera systems. When a face image is captured by the camera, it is passed to the recognition software for identification in real time. The AFR system then determines whether there is a "match"; if so, an alarm solicits an operator's attention to verify the match and execute the appropriate action[3]. For example, South Wales Police have used an AFR system to compare images of people in crowds attending events with pre-determined watch lists of suspected mobile phone thieves[4]. In the future, the police may also compare potential suspects against images from closed-circuit television cameras (CCTV) or mobile phone footage for evidential and investigatory purposes[5]. The AFR system may thus serve as a tool of crime prevention as much as a form of crime detection[6]. However, the uses of AFR technology are seen as dangerous and intrusive by the UK public[7]. For one thing, misuse of AFR technology by police agencies could cause serious harm to democracy and human rights. For another, it could have a chilling effect on civil society, and people may self-censor lawful behavior under constant surveillance[8].
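The match-and-alert flow described above can be pictured with a minimal sketch (illustrative Python with invented embeddings and an assumed threshold; real AFR deployments use proprietary models and operationally tuned thresholds):

```python
import numpy as np

THRESHOLD = 0.8  # assumed similarity threshold, purely for illustration

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Pretend face embeddings; a real system would produce these with a trained model.
watchlist = {"suspect_001": np.random.rand(128)}
captured_face = np.random.rand(128)  # face captured from the camera feed

for person_id, reference in watchlist.items():
    score = cosine(captured_face, reference)
    if score >= THRESHOLD:
        # The system only raises an alert; a human operator still has to
        # verify the match before any action is taken.
        print(f"ALERT: possible match with {person_id} (score={score:.2f})")
```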
III. The supervision mechanism of AFR technology
To maintain public trust, there must be a supervision mechanism to oversee the use of AFR technology in law enforcement. The UK Home Office indicates that the use of AFR technology is governed by a number of codes of practice, including the Police and Criminal Evidence Act 1984, the Surveillance Camera Code of Practice, and the Information Commissioner's Office (ICO)'s Code of Practice for surveillance cameras[9].
(I) Police and Criminal Evidence Act 1984
The Police and Criminal Evidence Act (PACE) 1984 lays down police powers to obtain and use biometric data, such as collecting DNA and fingerprints from people arrested for a recordable offence. The PACE allows law enforcement agencies to carry out identification in order to find people related to crime for criminal and national security purposes. Therefore, for investigation, detection and prevention tasks related to crime and terrorist activities, the police can collect the facial image of a suspect, which can be read as falling within the scope of authorization of the PACE.
(II) Surveillance Camera Code of Practice
The use of CCTV in public places interferes with the rights of the people, so the Protection of Freedoms Act 2012 requires the establishment of an independent Surveillance Camera Commissioner (SCC) for supervision. The Surveillance Camera Code of Practice proposed by the SCC sets out 12 principles guiding the operation and use of surveillance camera systems. The 12 guiding principles are as follows[10]:
A. Use of a surveillance camera system must always be for a specified purpose which is in pursuit of a legitimate aim and necessary to meet an identified pressing need.
B. The use of a surveillance camera system must take into account its effect on individuals and their privacy, with regular reviews to ensure its use remains justified.
C. There must be as much transparency in the use of a surveillance camera system as possible, including a published contact point for access to information and complaints.
D. There must be clear responsibility and accountability for all surveillance camera system activities including images and information collected, held and used.
E. Clear rules, policies and procedures must be in place before a surveillance camera system is used, and these must be communicated to all who need to comply with them.
F. No more images and information should be stored than that which is strictly required for the stated purpose of a surveillance camera system, and such images and information should be deleted once their purposes have been discharged.
G. Access to retained images and information should be restricted and there must be clearly defined rules on who can gain access and for what purpose such access is granted; the disclosure of images and information should only take place when it is necessary for such a purpose or for law enforcement purposes.
H. Surveillance camera system operators should consider any approved operational, technical and competency standards relevant to a system and its purpose and work to meet and maintain those standards.
I. Surveillance camera system images and information should be subject to appropriate security measures to safeguard against unauthorised access and use.
J. There should be effective review and audit mechanisms to ensure legal requirements, policies and standards are complied with in practice, and regular reports should be published.
K. When the use of a surveillance camera system is in pursuit of a legitimate aim, and there is a pressing need for its use, it should then be used in the most effective way to support public safety and law enforcement with the aim of processing images and information of evidential value.
L. Any information used to support a surveillance camera system which compares against a reference database for matching purposes should be accurate and kept up to date.
(III) ICO's Code of Practice for surveillance cameras
Attention must also be paid to personal data and privacy protection during the use of surveillance camera systems and AFR technology. The ICO issued its Code of Practice for surveillance cameras under the Data Protection Act 1998 to explain the legal requirements for operators of surveillance cameras. The key points of the ICO's Code of Practice for surveillance cameras are summarized as follows[11]:
A. The operating time of surveillance camera systems should be carefully evaluated and adjusted; it is recommended to regularly evaluate whether continued use is necessary and proportionate.
B. A police force should ensure effective administration of surveillance camera systems, deciding who has responsibility for the control of personal information, what is to be recorded, how the information should be used, and to whom it may be disclosed.
C. Recorded material should be stored in a safe way to ensure that personal information can be used effectively for its intended purpose. In addition, the information may be encrypted if necessary.
D. Disclosure of information from surveillance systems must be controlled and consistent with the purposes for which the system was established.
E. Individuals whose information is recorded have a right to be provided with that information or to view it. The ICO recommends that the information be provided promptly and within no longer than 40 calendar days of receiving a request.
F. The minimum and maximum retention periods of recorded material are not prescribed in the Data Protection Act 1998, but material should not be kept for longer than is necessary and should be kept for the shortest period necessary to serve the purposes for which the system was established.
(IV) A new oversight and advisory board
In addition to the aforementioned regulations and guidance, the UK Home Office mentioned that it will work closely with related authorities, including the ICO, SCC, Biometrics Commissioner (BC), and Forensic Science Regulator (FSR), to establish a new oversight and advisory board to coordinate consideration of law enforcement's use of facial images and facial recognition systems[12].
To sum up, the use of AFR technology by law enforcement is expected to abide by the existing regulations and guidance. Firstly, surveillance camera systems must be used for the purposes for which they were established. Secondly, clear responsibility and accountability mechanisms should be ensured. Thirdly, individuals whose information is recorded have the right to request access to the relevant information. In the future, the new oversight and advisory board will be asked to consider issues relating to law enforcement's use of AFR technology with greater transparency.
IV. Follow-up key issues for the use of AFR technology
Regarding the UK Home Office's Biometrics Strategy, members of independent agencies such as the ICO, BC, and SCC, as well as civil society, believe that there are still many deficiencies. The relevant discussions are summarized as follows:
(I) The necessity of using AFR technology
Elizabeth Denham, the Information Commissioner, called for the use of AFR technology to be examined carefully, because AFR is an intrusive technology and can increase the risk of intruding into our privacy. Therefore, for the use of AFR technology to be legal, the UK police must have clear evidence to demonstrate that the use of AFR technology in public spaces is effective in resolving the problem that it aims to address[13].
The Home Office has pledged to undertake Data Protection Impact Assessments (DPIAs) before introducing AFR technology, covering the purpose and legal basis, the framework that applies to the organization using the biometrics, necessity and proportionality, and so on.
(II) The limitations of using facial image data
The UK police can collect, process and use personal data based on the need for crime prevention, investigation and prosecution. In order to secure the use of biometric information, the BC was established under the Protection of Freedoms Act 2012. The mission of the BC is to regulate the use of biometric information, provide protection from disproportionate enforcement action, and limit the application of surveillance and counter-terrorism powers. However, the BC's powers do not presently extend to forms of biometric information other than DNA or fingerprints[14]. The BC has expressed concern that, while the use of biometric data may well be in the public interest for law enforcement purposes and to support other government functions, the public benefit must be balanced against the loss of privacy. Hence, legislation should be enacted to decide that crucial question, instead of depending on the BC's case-by-case feedback[15]. Because biometric data is especially sensitive and highly intrusive of individual privacy, a governance framework seems to be required to make decisions on the use of facial images by the police.
(III) Database management and transparency
For the application of AFR technology, the scope of the biometric database is a disputed issue in the UK. It is worth mentioning that the British public distrusts the criminal database held by the police. When someone is arrested and detained by the police, the police will take photos of the suspect's face. However, unlike fingerprints and DNA, even if the person is not charged, their facial images are not automatically deleted from the police biometric database[16]. South Wales Police have used AFR technology to compare facial images of people in crowds attending major public events with pre-determined watch lists of suspected mobile phone thieves in AFR field tests. Although the watch lists are created for time-limited and specific purposes, the inclusion of suspects who may be innocent people still causes public alarm. Elizabeth Denham warned that there should be a transparent system for retaining facial images of those arrested but not charged with certain offences[17]. Therefore, the UK Home Office may need to establish a transparent system for the AFR biometric database and a related supervision mechanism.
(IV) Accuracy and identification errors
In addition to worries about infringing personal privacy, the low accuracy of AFR technology is another reason many people oppose its use by police agencies. Silkie Carlo, director of Big Brother Watch, said the police must immediately stop using AFR technology and avoid mistaking thousands of innocent citizens for criminals; Paul Wiles, the Biometrics Commissioner, also called for legislation to manage AFR technology because its accuracy is too low, and argued that the use of AFR technology should be tested and subjected to external peer review[18]. In the Home Office's Biometrics Strategy, the scientific quality standards for AFR technology will be established jointly with the FSR, an independent agency under the Home Office. In other words, the Home Office plans to extend the existing forensic science regime to regulate AFR technology. The FSR has therefore worked with the SCC to develop standards relevant to digital forensics, but the UK government has not yet issued specific standards for regulating the accuracy of AFR technology.
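To see how a reported figure such as "98% inaccurate" can arise, the following sketch works through the base-rate arithmetic with purely illustrative numbers (they are not the Metropolitan Police's or South Wales Police's actual statistics): when genuine watch-list members are rare in a crowd, even a small false-alarm rate produces far more false alerts than true ones.

```python
# Illustrative numbers only.
crowd, on_list = 100_000, 50   # faces scanned, of which 50 are genuinely listed
hit_rate = 0.90                # assumed chance a listed person triggers an alert
false_alarm_rate = 0.01        # assumed chance an unlisted person triggers an alert

true_alerts = on_list * hit_rate                      # 45
false_alerts = (crowd - on_list) * false_alarm_rate   # about 1,000

precision = true_alerts / (true_alerts + false_alerts)
print(f"share of alerts that are correct: {precision:.1%}")  # roughly 4%
```

Under these assumed figures, roughly 96% of alerts would be false matches, which is why the proportion of incorrect matches reported from field trials can be very high even when the underlying algorithm appears capable.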
V. Conclusion
From the discussion of the public and private sectors in the UK, we can summarize some rules for the use of AFR technology. Firstly, before AFR technology is applied, a pre-assessment should be completed to ensure the benefits to society as a whole. Secondly, there is the possibility of identification errors in AFR technology; therefore, in order to maintain public confidence and trust, the relevant scientific standards should be set up first to test the system's accuracy. Thirdly, the AFR system should be regarded as an assistive tool for law enforcement in the initial stage. In other words, the information analyzed by the AFR system should still be judged by law enforcement officials, and the police officers should bear the responsibility. In order to balance the protection of the public interest and basic human rights, the use of biometric data in AFR technology should be regulated by a dedicated law rather than only by the regulations on surveillance cameras and data protection. The scope of the identification database is also a key point, and collecting and storing the facial image data of innocent people may need legislators' approval. Last but not least, the use of the AFR system should be transparent, and victims of human rights violations should be able to seek redress.
[1] UK Home Office, Biometrics Strategy, Jun. 28, 2018, https://www.gov.uk/government/publications/home-office-biometrics-strategy (last visited Aug. 09, 2018), at 7.
[2] Big Brother Watch, FACE OFF CAMPAIGN: STOP THE MET POLICE USING AUTHORITARIAN FACIAL RECOGNITION CAMERAS, https://bigbrotherwatch.org.uk/all-campaigns/face-off-campaign/ (last visited Aug. 16, 2018).
[3] Lucas Introna & David Wood, Picturing algorithmic surveillance: the politics of facial recognition systems, Surveillance & Society, 2(2/3), 177-198 (2004).
[4] Supra note 1, at 12.
[5] Id., at 25.
[6] Michael Bromby, Computerised Facial Recognition Systems: The Surrounding Legal Problems (Sep. 2006) (LL.M Dissertation, Faculty of Law, University of Edinburgh), http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.197.7339&rep=rep1&type=pdf, at 3.
[7] Owen Bowcott, Police face legal action over use of facial recognition cameras, The Guardian, Jun. 14, 2018, https://www.theguardian.com/technology/2018/jun/14/police-face-legal-action-over-use-of-facial-recognition-cameras (last visited Aug. 09, 2018).
[8] Martha Spurrier, Facial recognition is not just useless. In police hands, it is dangerous, The Guardian, May 16, 2018, https://www.theguardian.com/commentisfree/2018/may/16/facial-recognition-useless-police-dangerous-met-inaccurate (last visited Aug. 17, 2018).
[9] Supra note 1, at 12.
[10] Surveillance Camera Commissioner, Surveillance camera code of practice, Oct. 28, 2014, https://www.gov.uk/government/publications/surveillance-camera-code-of-practice (last visited Aug. 17, 2018).
[11] UK Information Commissioner's Office, In the picture: A data protection code of practice for surveillance cameras and personal information, Jun. 09, 2017, https://ico.org.uk/for-organisations/guide-to-data-protection/encryption/scenarios/cctv/ (last visited Aug. 10, 2018).
[12] Supra note 1, at 13.
[13] Elizabeth Denham, Blog: facial recognition technology and law enforcement, Information Commissioner's Office, May 14, 2018, https://ico.org.uk/about-the-ico/news-and-events/blog-facial-recognition-technology-and-law-enforcement/ (last visited Aug. 14, 2018).
[14] Monique Mann & Marcus Smith, Automated Facial Recognition Technology: Recent Developments and Approaches to Oversight, Automated Facial Recognition Technology, 10(1), 140 (2017).
[15] Biometrics Commissioner, Biometrics Commissioner's response to the Home Office Biometrics Strategy, Jun. 28, 2018, https://www.gov.uk/government/news/biometrics-commissioners-response-to-the-home-office-biometrics-strategy (last visited Aug. 15, 2018).
[16] Supra note 2.
[17] Supra note 13.
[18] Jon Sharman, Metropolitan Police's facial recognition technology 98% inaccurate, figures show, INDEPENDENT, May 13, 2018, https://www.independent.co.uk/news/uk/home-news/met-police-facial-recognition-success-south-wales-trial-home-office-false-positive-a8345036.html (last visited Aug. 09, 2018).
Taiwan Planned a Major Promotion Program for the Biotechnology and Pharmaceutical Industry
Taiwan Government Launched the "Biotechnology Action Plan"
The Taiwan government has planned to boost support for and develop local industries across the following six major sectors: biotechnology, tourism, health care, green energy, innovative culture and post-modern agriculture. As the biotechnology industry has matured since the promulgation of the "Biotech and New Pharmaceutical Development Act" in July 2007, it will take the lead among the above sectors. Thus, the Executive Yuan has launched the Biotechnology Action Plan as the first project in building the leading industry sectors, to upgrade local industries and stimulate future economic growth.
Taiwan Government Planned to Promote Biotechnology and Other Emerging Industries by Investing Two Hundred Billion NT Dollars
To expand the scale of each industry, enhance industrial value, increase the value surrounding the main industrial fields, and encourage industrial development through government investment that creates job opportunities and sustains economic development, the Executive Yuan's Council for Economic Planning and Development announced on November 19, 2009 that the government had selected six emerging industries, namely "Biotechnology", "Green Energy", "Refined Agriculture", "Tourism", "Medical Care", and "Cultural Creativity", to promote national economic growth. The government will invest two hundred billion NT dollars from 2009 to 2012 to aggressively support industrial development and to stimulate private investment. According to an evaluation reported by the Chung-Hua Institution for Economic Research, the future growth rate will reach 8.16%; hence, the outlook for these industries appears quite bright. Currently, the government plans to channel money into the six emerging industries through existing investment mechanisms. For instance, firstly, in accordance with Article 5, Paragraph 1 of the "Act for the Development of Biotech and New Pharmaceuticals Industry", "for the purpose of promoting the Biotech and New Pharmaceuticals Industry, a Biotech and New Pharmaceuticals Company may, for a period of five years from the time it is subject to corporate income tax, enjoy a reduction in its corporate income tax payable for up to thirty-five percent (35%) of the total funds invested in research and development ("R&D") and personnel training each year; provided, however, that if the R&D expenditure of a particular year exceeds the average R&D expenditure of the previous two years or if the personnel training expenditure of a particular year exceeds the average personnel training expenditure of the previous two years, fifty percent (50%) of the amount in excess of the average may be used to credit against the amount of corporate income tax payable."
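To make the mechanics of the Article 5 tax credit concrete, the sketch below works through one plausible reading of the provision using invented figures (the amounts are hypothetical, and the exact statutory computation should be confirmed against the act and its implementing rules):

```python
# Hypothetical figures, in millions of NT dollars.
rd_current = 120                   # current-year R&D and personnel training spending
rd_previous_two_years = [80, 100]  # the previous two years' spending

prev_average = sum(rd_previous_two_years) / 2   # 90
excess = max(rd_current - prev_average, 0)      # 30

# One plausible reading: a 35% credit on spending up to the prior-two-year
# average, and a 50% credit on the portion exceeding that average.
credit = 0.35 * min(rd_current, prev_average) + 0.50 * excess
print(f"creditable against corporate income tax: {credit:.1f} million NTD")  # 46.5
```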
Secondly, according to Article 6, Paragraph 1 of the same act, "in order to encourage the establishment or expansion of Biotech and New Pharmaceuticals Companies, a profit-seeking enterprise that (i) subscribes for the stock issued by a Biotech and New Pharmaceuticals Company at the time of the latter's establishment or subsequent expansion; and (ii) has been a registered shareholder of the Biotech and New Pharmaceuticals Company for a period of three (3) years or more, may, for a period of five years from the time it is subject to corporate income tax, enjoy a reduction in its corporate income tax payable for up to twenty percent (20%) of the total amount of price paid for the subscription of shares in such Biotech and New Pharmaceuticals Company; provided that such Biotech and New Pharmaceuticals Company has not applied for exemption from corporate income tax or shareholders investment credit based on the subscription price under other applicable laws and regulations."
Thirdly, to promote the development of the entire biotechnology industry, the government has drafted the "Biotechnology Takeoff Package" to subsidize start-up venture investment companies that satisfy certain conditions: investing up to NT$5 billion in "drug discovery", "medical devices" or other related biotech fields, with over 50% of the capital invested in domestic industry, with the operating experience of multinational biotech investment companies holding capital of over NT$150 million in related industrial fields, and with accumulated doctoral-level professional working experience of up to 60 years.
Additionally, the refined agriculture sector has not only developed genetically selected products, but has also combined tourism with farming to create new business models. According to Point 4, Paragraph 1 of the "Guidelines for Preferential Loans for the Upgrading of Tourism Enterprises", expenditure on machinery, instruments, land or repairs can be granted a preferential loan in accordance with Point 6, and the government will provide an interest subsidy for tourism enterprises that repay their loans on time.
Finally, the Council for Economic Planning and Development also points out that, because of its export-oriented strategy, most of the technology industry has been seriously affected by fluctuations in the international economy. Furthermore, Taiwan's technology exports have been hit by the U.S. financial crisis and the economic downturn in the EU and the US, and industrial development faces problems caused by over-concentration in Taiwan. Hence, the current framework of domestic industry should be rearranged and improved by promoting the development project for the six emerging industries.
Taiwan Government Modified Rules to Accelerate the NDA Process and Facilitate the Development of Clinical Studies in Taiwan
In July 2007, the "Biotech and New Pharmaceutical Development Act" modified many regulations related to pharmaceutical administration, taxes, and professionals in Taiwan. In addition, in order to facilitate the development of the biotechnology and pharmaceutical industries, the government has attempted to create a friendly environment for research and development by setting up appropriate regulations and application systems. These measures show that the Taiwanese government is keenly aware that these industries have huge potential value.
To operate in coordination with the above act and to better deal with the increasing productivity of pharmaceutical R&D programs in Taiwan, the Executive Yuan simplified the New Drug Application (NDA) process, easing the Certificate of Pharmaceutical Product (CPP) requirements for drugs with new ingredients. The current NDA process requires sponsors to submit documentation as specified by one of the following four options: (1) three CPPs from three of "ten medically-advanced countries", namely Germany, the U.S., England, France, Japan, Switzerland, Canada, Australia, Belgium, and Sweden; (2) one CPP from the U.S., Japan, Canada, Australia, or England and one CPP from Germany, France, Switzerland, Sweden, or Belgium; (3) a Free Sale Certificate (FSC) from one of the ten medically-advanced countries where the pharmaceuticals are originally produced and one CPP from one of the other nine countries; or (4) a CPP from the European Medicines Agency. Thus, the current NDA process requires sponsors to spend inordinate amounts of time and incur significant costs to acquire two or three FSCs or CPPs from the ten medically-advanced countries in order to submit an NDA in Taiwan. Under the new rules, sponsors will not have to submit the above CPPs if (1) Phase I clinical studies have been conducted in Taiwan, and Phase III pivotal trial clinical studies have been conducted simultaneously in Taiwan and in another country; or (2) Phase II and Phase III pivotal trial clinical studies have been conducted simultaneously in Taiwan and in another country. In addition, minimum numbers of patients must have been evaluated in each of the above phases. Therefore, sponsors who conduct clinical studies in Taiwan and in another country simultaneously can reduce their costs and shorten the NDA process in Taiwan. The new rules aim to encourage international pharmaceutical companies to conduct clinical studies in Taiwan or to conduct such studies cooperatively with Taiwanese pharmaceutical companies. Such interactions will allow Taiwanese pharmaceutical companies to participate in the development and implementation of international clinical studies, in addition to benefiting from the shortened NDA process. As a result, the R&D capabilities and the internationalization of the Taiwanese pharmaceutical industry will be improved.
Suggestions for the MOEA Trial Program of a Voluntary-Based Green Electricity Framework
On March 6, 2014, the Energy Bureau of the Ministry of Economic Affairs published a pre-announcement of a Trial Program of a Voluntary-Based Green Electricity Framework (hereafter the Trial Program) and consulted for public opinion. In light of the content of the Trial Program, STLI provides the following suggestions for the future planning of the related policy structure.
The green electricity scheme established by the Trial Program is one of the policies for promoting renewable energy. Despite its trial nature, it is suggested that a policy design with more options would be more beneficial to the promotion of renewable energy, in light of the various measures that have been undertaken by different countries.
According to the Trial Program, the planned rate for green electricity is set by dividing the total electricity subsidy to be paid by the Renewable Energy Development Fund by the total green electricity generation reported by the Taiwan Power Company. The Ministry of Economic Affairs will adjust the green electricity rate on the basis of both the number of users subscribing to green electricity and international green electricity market rates, and will announce the rate in October of each year unless otherwise designated. In addition, under the planned Trial Program, the subscription unit for green electricity is 100 kW·h. It is further reported that the currently planned premium for green electricity is 1.06 NTD/kW·h; added to the original electricity rate, the total comes to 3.95 NTD/kW·h, an increase of roughly 37% per kW·h. Under the Trial Program as currently drafted, only a single price rate will be offered during the trial period.
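As a quick illustration of what these announced figures imply for a subscriber (simple arithmetic using only the numbers reported above; the actual billing formula is defined by the Trial Program itself):

```python
premium = 1.06      # NTD per kWh, planned green electricity premium
total_rate = 3.95   # NTD per kWh, original rate plus the premium
unit = 100          # kWh, one subscription unit under the Trial Program

base_rate = total_rate - premium      # about 2.89 NTD/kWh
increase = premium / base_rate        # about 0.37, i.e. the reported ~37% rise
premium_per_unit = premium * unit     # extra cost for each 100 kWh subscribed

print(f"base rate: {base_rate:.2f} NTD/kWh")
print(f"price increase: {increase:.0%}")
print(f"green premium per subscription unit: {premium_per_unit:.0f} NTD")
```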
In this regard, we take the view that it would be beneficial to take into account similar approaches adopted by other countries. In Germany, for instance, the furtherance of renewable energy is achieved through an obligatory surcharge (EEG-Umlage) together with voluntary green electricity programs provided by private electricity retailers. According to the German Ministry of Economics and Energy (BMWi), the electricity price paid by the German public comprises three parts: (1) the cost of purchasing and distributing the electricity, including the margin of the electricity provider; (2) regulated network fees, including those for the operation and metering of the meters; and (3) charges imposed by the government, including taxes and the abovementioned obligatory renewable energy surcharge (EEG-Umlage), as prescribed by the Act on Renewable Energy (Gesetz für den Vorrang Erneuerbarer Energien, also known as the Erneuerbare-Energien-Gesetz, EEG). As to how this is implemented on the ground, an example of the green electricity price menus offered by the German electricity retailer Vattenfall is given below. In all price menus provided by Vattenfall in Berlin, for instance, 29.4% of the electricity comes from renewable energy as a result of the implementation of the Act on Renewable Energy. Aside from that percentage, which is driven by the existing obligatory measures, German electricity retailers further provide price menus that are "greener".
For example, among the options provided by Vattenfall (Chart I), under the 12-month program one can choose a menu consisting of 39.4% renewable energy at a price of 0.2642 Euro/kW·h (about 10.96 NTD/kW·h), or opt for a menu supplied entirely from renewable energy at 0.281 Euro/kW·h (about 11.66 NTD/kW·h).
Chart I: Green Electricity Price Menus provided by Vattenfall in Berlin, Germany
Program | Percentage of Renewable Energy Supply | Electricity Price
12-month program | 39.4% | 0.2642 Euro/kW·h (about 10.96 NTD/kW·h)
All-renewable-energy program | 100% | 0.281 Euro/kW·h (about 11.66 NTD/kW·h)
Source: Vattenfall website, translated and reorganized by STLI, April 2014.
In addition, Australia also has similar voluntary green electricity programs aimed at promoting renewable energy, reducing carbon emissions, and transforming the energy economy. Since 1997, GreenPower in Australia has been in charge of auditing and certifying retail companies and power plants with respect to green electricity. The Australian model uses a certification mechanism conducted by an independent third party to ensure that the green electricity purchased by end users complies with specific standards. As for the price menu options, take the green electricity programs offered by the Australian retailer Origin Energy for example: users can choose among six different programs, composed of renewable energy supply of respectively 10%, 20%, 25%, 50%, 75%, and 100%, at various price rates (shown in Chart II).
Chart II: Australian Green Electricity Programs provided by Origin Energy
Percentage of Renewable Energy | Electricity Price per kW·h
0% | 0.268 AUD (about 7.52 NTD)
10% | 0.274868 AUD (about 7.69 NTD)
20% | 0.28006 AUD (about 7.84 NTD)
25% | 0.28292 AUD (about 7.92 NTD)
50% | 0.2838 AUD (about 7.95 NTD)
100% | 0.2992 AUD (about 8.37 NTD)
Source: Origin Energy website, translated and reorganized by STLI, April 2014.
Given the information above, it can be inferred that international mechanisms for promoting green electricity often include a variety of price menus, providing users with more options, such as the two programs offered by Vattenfall in Germany and the six rates offered by Origin Energy in Australia. This brief therefore suggests that the Trial Program reference these international examples and offer users greater flexibility in choosing the programs most suitable for them.