Law School News Network (provided by the Law School) On the afternoon of November 14, 2021, the "Seminar on Algorithm Governance and Copyright Protection Issues," hosted by the Law School, was successfully held as an online conference. More than 20 experts from over ten universities, courts, and practice institutions participated, including Zhongnan University of Economics and Law, Shanghai Jiao Tong University, the Chinese Academy of Social Sciences, East China University of Political Science and Law, China University of Political Science and Law, Tsinghua University, Soochow University, the Intellectual Property Judicial Protection Research Center of the Supreme People's Court, the Shanghai Higher People's Court, and the Beijing Haidian District People's Court.
Wu Handong, former president of Zhongnan University of Economics and Law and senior Wenlan Professor, and Professor Mei Xiaying, dean of the host university's Law School, delivered opening speeches. The attending experts held extensive and in-depth discussions on topics such as "the relationship between algorithm recommendation and technology neutrality," "the legal nature of algorithm-pushed content," "the important value of new technology applications in copyright protection," and "the modern reform of notice-and-takedown rules," and produced a series of highly constructive opinions. The seminar was moderated by Professor Lu Haijun of the host university's School of Law.
Seminar on "The Relationship between Algorithm Recommendation and Technology Neutrality"
The participating experts agreed that algorithm recommendation is a specific application of algorithms: it is distinct from the algorithm technology itself and can hardly meet the requirements of "technology neutrality."
Experts argued that a distinction should be drawn between the "algorithm" itself and "algorithm recommendation": there is no absolute technology neutrality in a platform's use of algorithm recommendation.
Professor Guan Yuying, director of the Intellectual Property Office of the Institute of Law, Chinese Academy of Social Sciences, pointed out that the "technology neutrality" defense in the field of copyright infringement originated in the United States with the "Sony case" of the 1980s and the "Grokster case" of the early 2000s, in which the courts reached opposite conclusions. Notably, in the "Sony case," where the technology-neutrality defense succeeded, the principle applied to a device provider whose equipment was used for home recording of copyrighted works, conduct that constituted fair use, so the provider was not liable for contributory infringement; the "Grokster case," by contrast, involved the unauthorized dissemination and use of copyrighted works by individual users in a network environment, and the defendant's technology-neutrality defense for facilitating that conduct failed.
Professor Kong Xiangjun, dean of the KoGuan Law School of Shanghai Jiao Tong University, pointed out that whether algorithm recommendation is technologically neutral depends on whether the algorithm can be configured, selected, and controlled. If the platform has actual control over the algorithm, the neutrality of its algorithm recommendations is generally difficult to establish.
Lin Ziying, a researcher at the Intellectual Property Judicial Protection Research Center of the Supreme People's Court, likewise said that whether algorithm recommendation is neutral depends on the platform's function and its purpose in using the algorithm; it is inappropriate to discuss neutrality entirely divorced from the purpose of use.
Judge Yang Dejia, presiding judge of the Intellectual Property Division of the Beijing Haidian District People's Court, pointed out bluntly that applications of technology, especially market-oriented, large-scale applications, can never be neutral in the true sense. The commercial, market-oriented technology applications we see serve clear purposes of business entities, are the result of precise calculation and trade-offs of interests, and reflect the distinct value pursuits of their users.
Experts argued that the neutrality of the "algorithm" technology itself is also questionable: subjectivity enters at the design and development stage, including designers' selection biases and values.
Jiang Ge, associate professor and doctoral supervisor at the Tsinghua University School of Law, pointed out that algorithms are essentially cognitive tools, a simplification of the real world, and that algorithm design embodies a great many of the designers' choices, arrangements, and values.
Li Yang, vice president of the China Intellectual Property Law Research Association and professor at the School of Civil, Commercial and Economic Law of China University of Political Science and Law, went further, saying that recommendation algorithms have carried strong value judgments since their birth: they are not technology-neutral products but are rooted in specific usage scenarios, and could even be said to be "born with original sin." A content platform's use of recommendation algorithms plainly involves the network dissemination of works, and invoking the supposed "neutrality" of the recommendation algorithm to cover up the purpose of its use only obscures the issues of piracy and infringement.
Xu Xu, executive director of the host university's Digital Economy and Legal Innovation Research Center and associate professor at its School of Law, pointed out that from the perspective of public law and industry regulation, algorithm neutrality is a false proposition. The state recently issued a series of regulations, such as the "Internet Information Service Algorithm Recommendation Management Provisions (Draft for Comment)," emphasizing that algorithms should be used for good. The positive obligations of algorithm platforms should not stop at merely not breaking the law: platforms also need to actively push information content consistent with mainstream values, optimize every link including retrieval, ranking, push, and display, and avoid information cocoons. Algorithms have always embodied rich values and cannot be neutral.
Seminar on "The Legal Nature of Algorithm-Pushed Content"
The participating experts agreed that the "veil of algorithm recommendation" should be lifted: in essence, the platform uses the algorithm as a content-push tool, and the legal nature of, and responsibility for, algorithm recommendation must be scientifically defined.
Judge Yang Dejia said that in recent discussions of cutting-edge issues, we need to be wary of the tendency to personify and mystify algorithms and artificial intelligence; we should look through the phenomenon to the essence, lest the truly responsible party be blurred. The relationship between an algorithm recommendation and the platform operator is that of a tool and its user. The "mystery" of the algorithm should be dispelled, the user of algorithm recommendation identified, and the questions examined of what conduct that user performed to produce the infringing result, whether it was at fault, and whether it should bear liability.
Experts argued that algorithm recommendation is merely a means by which platforms distribute content; a platform that uses algorithms to push content must reconsider its role.
Professor Wu Handong said that we have now entered an intelligent era, whose hallmark is the ubiquity of algorithms. Algorithm recommendation has profoundly changed platform information services: in the past, people sought information, a passive mode; now information seeks people, an active one. In terms of role, the platform was once a neutral information provider, but it may now become a far more active content participant.
Xu Chao, former deputy director of the Copyright Department of the National Copyright Administration, pointed out that if the recommended content is a work, copyright issues arise. A party that uses algorithm recommendation technology to provide works to users is an internet content provider (ICP) and commits direct infringement, and the safe harbor provisions cannot apply to direct infringement; they apply only to indirect infringement. The "network service provider" in Article 1194 of the Civil Code, which includes ICPs, is a direct infringer; only the network service providers in Articles 1195 to 1197 are indirect infringers, and only they may invoke the notice-and-takedown safe harbor. If a platform uses algorithm recommendation technology to provide works to users directly, even under the banner of being a mere platform, it is in essence a direct infringer and cannot invoke the safe harbor rules; if the platform only provides the algorithm to users and a copyright dispute arises, the safe harbor rules apply. As the Grokster judgment stated, to find a platform indirectly liable for infringement, the existence of direct infringement must first be proved.
Professor Guan Yuying said that an ISP, properly so called, is a network service provider that performs no processing, selection, or arrangement of content; it merely provides technical access or storage services and is not involved in content provision itself. A platform that uses algorithms to recommend content, however, directly processes that content.
Experts argued that algorithmic recommendation and human recommendation differ only in means, not in legal nature. Liability for infringing content generated by algorithm recommendation is more appropriately assigned to the platform, as the party that profits from and creates the risk.
Professor Dong Binghe of the Kenneth Wang School of Law, Soochow University, proposed that the algorithm itself may be value-neutral, but that this has nothing to do with the principle of technology neutrality, which requires the law to be neutral toward technology, with neither preference nor discrimination. Copyright infringement liability arising from algorithm recommendation has nothing to do with the algorithm itself, because the legal consequences are essentially the same whether the recommendation is made by a human or by an algorithm.
Professor Cong Lixian, dean of the School of Intellectual Property of East China University of Political Science and Law, pointed out that in the more than 300 years of copyright law's development, new technologies have mainly served as tools of dissemination. If a platform in essence uses algorithms to disseminate works, then, like content providers using any other technology to disseminate works, it should bear copyright liability and duties of care under the established rules of copyright law.
Judge Xu Jun, deputy chief judge of the Intellectual Property Tribunal of the Shanghai Higher People's Court, pointed out that algorithmic push merely changes how information is distributed and delivered, making delivery more precise, and does not change the nature of the act as network dissemination of information. Although individual users do receive different pushed content, from a group perspective audiences with similar interests or habits still receive the same information: a large volume of information is still pushed to a large audience, only now selectively to particular types of audiences rather than indiscriminately to everyone. Under some business models and on some platforms, pushed information rotates off the homepage periodically, but that only means it is no longer displayed in a prominent homepage position; the audience can still retrieve the platform's content at a time and place of its own choosing. A platform that uses algorithm recommendation cannot be exempted from its copyright obligations regarding network dissemination of information simply because of the push technology it employs.
Seminar on "The Important Value of New Technology Applications in Copyright Protection" and "Modern Reform of Notification and Deletion Rules"
The participating experts agreed that the historical limitations of the "notice-and-takedown" rule are becoming increasingly obvious: attention should be paid to copyright protection technologies such as copyright identification and blocking, and copyright protection responsibilities should be scientifically allocated to platforms.
Experts argued that the "safe harbor" system is the product of an era when copyright protection technology was underdeveloped. Times have changed: against the backdrop of increasingly mature copyright protection technologies in the industry, platforms' duty of care in copyright protection must be strengthened.
Professor Wu Handong said that algorithm recommendation platforms can use algorithms not only to recommend online content but also to monitor and screen infringing content. If a platform has this technical monitoring capability yet fails to use it, whether it should bear corresponding legal responsibility is worth reflecting on.
Professor Kong Xiangjun pointed out that copyright infringement problems caused by new technologies such as algorithm recommendation should be solved by technical means. It is undeniable that in hot areas such as short-video copyright infringement, copyright filtering technology has made substantial progress in recent years. A platform's duty of care should match the current state of the technology: since algorithms and related technologies have improved and push capabilities have grown, the duty of care in filtering copyright infringement must be strengthened accordingly.
Hu Huiji, legal director of iQiyi, pointed out that platforms such as YouTube have introduced copyright filtering technologies such as Content ID, whose anti-piracy mechanisms can block more than 99% of infringing content. Since overseas platforms can achieve this level of copyright protection technology, leading domestic algorithm recommendation platforms are fully capable of doing so as well.
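The Content ID mechanism mentioned above is proprietary, but its core idea, matching uploads against fingerprints registered by rights holders before content is distributed, can be sketched as follows. This is a deliberately simplified toy model using text chunk hashes; real systems fingerprint audio and video in ways robust to editing, and every name and threshold here is illustrative rather than an actual platform API:

```python
import hashlib

def fingerprint(content: str, chunk_size: int = 16) -> set[str]:
    """Hash fixed-size chunks of the content to form a fingerprint set."""
    chunks = (content[i:i + chunk_size] for i in range(0, len(content), chunk_size))
    return {hashlib.sha256(c.encode("utf-8")).hexdigest() for c in chunks}

def match_ratio(upload: str, registry: set[str]) -> float:
    """Fraction of an upload's chunks that match registered copyrighted works."""
    fp = fingerprint(upload)
    return len(fp & registry) / len(fp) if fp else 0.0

# A rights holder registers a work; the platform screens uploads before push.
work = "All the world's a stage, and all the men and women merely players."
registry = fingerprint(work)

print(match_ratio(work, registry))  # 1.0: a verbatim re-upload, so block it
print(match_ratio("Some unrelated original commentary text.", registry))  # 0.0
```

The point the experts make follows directly from this structure: once the registry exists, screening can happen before publication just as easily as after a complaint, which is why ex ante filtering is argued to be technically feasible.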
Judge Zhang Ying of the Shanghai Higher People's Court likewise said that a platform's use of algorithm recommendation technology triggers a higher duty of care. The platform has a certain ability to foresee which infringing works the algorithm will push: the algorithm is a concrete embodiment of the platform's rules and a reflection of its will, and although the algorithm may pose black-box problems in operation, its final outputs remain within the platform's controllable and foreseeable range. The platform profits greatly from algorithmic personalized recommendation while simultaneously increasing the risk of copyright infringement, so it should bear a corresponding duty of care for the content it pushes.
Experts argued that the "notice-and-takedown" rule must be prevented from becoming a shield behind which platforms condone copyright infringement, and that the concrete application of "notice plus necessary measures" should be actively explored.
Professor Li Yang said that the "notice-and-takedown" rule is outdated; in judicial practice, the "notice plus necessary measures" rule stipulated in the Civil Code should be applied wherever possible. Since platforms already perform ex ante identification and filtering of illegal content such as pornography, terrorism, and violence to meet public-law requirements, they must also possess the technical capability to protect copyright; at the very least, copyrighted works in their peak period of popularity can be reviewed and filtered in advance. Yet platforms often plead technical incapacity or technical neutrality as an excuse for infringement.
Professor Guan Yuying pointed out that, compared with the era in which the "notice-and-takedown" rule was born, our technology, algorithms, and computing power have improved enormously, and the platform's duty of care in copyright protection should rise accordingly. Judicial cases show that platforms can block and filter infringing content after the fact, so they should be able to do so beforehand.
Judge Yang Dejia pointed out that the safe harbor rules created by the U.S. Digital Millennium Copyright Act (DMCA) in 1998 were constrained by the technical level and application environment of the time. Technological development since then, especially the rise of content-sharing platforms and algorithm recommendation technology, has thoroughly upset the balance of interests the legislators originally designed. It is now widely recognized that, compared with the era in which the "safe harbor" system was created, "the wind is no longer what it once was"; to achieve a dynamic rebalancing of interests, today's safe harbor "must not remain what it was back then," and the conditions for entering the safe harbor need timely and appropriate adjustment.
Researcher Qu Haohui of the Intellectual Property Research Center of Zhongnan University of Economics and Law said that the traditional platform safe harbor rules can no longer cope with the current state of the copyright content industry. On the one hand, within the industrial chain, platforms obtain traffic and revenue from content push; on the other, in technology application and dispute response, platforms have far more experience and advantages than users or rights holders. When pushing content with advanced algorithms, therefore, a platform may not abuse its advantageous position and leave the resulting mess to users or regulators; when the technology is not controllable, the platform should put rights protection first and, where appropriate, abandon certain business models.
During the discussion, Fang Xiaoyu, a recommendation algorithm engineer from Kingsoft Cloud, and Zhang Xiaobo, a senior algorithm expert from Ant Financial, also explained in detail the basic principles of algorithm recommendation and the application of new technologies in copyright protection.
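The engineers' presentations are not reproduced here, but the basic principle of algorithm recommendation they describe, ranking candidate content by how well it matches a user's inferred interests so that "information finds people," can be sketched in toy form. The tags, items, and scoring rule below are all hypothetical and stand in for no particular platform's algorithm:

```python
from collections import Counter

# Toy interest profile inferred from the tags of items the user has clicked.
clicks = ["football", "football", "films", "football", "music"]
profile = Counter(clicks)  # e.g. football weighted 3, films 1, music 1

# Candidate items the platform could push, each labelled with a single tag.
candidates = [
    ("World Cup highlights", "football"),
    ("New album review", "music"),
    ("Stock market report", "finance"),
]

# Rank candidates by how strongly each tag matches the user's profile;
# the top-ranked item is what the algorithm "pushes" to this user first.
ranked = sorted(candidates, key=lambda item: profile[item[1]], reverse=True)
print([title for title, _ in ranked])
# ['World Cup highlights', 'New album review', 'Stock market report']
```

Even this minimal sketch illustrates the experts' point above: the ranking is not neutral, because the choice of what to count as a "click," how to weight it, and which candidates enter the pool are all decisions made by the platform.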
Finally, in his concluding remarks, Professor Lu Haijun pointed out that the experts' discussions rested on a basic consensus: copyright protection is the strong voice and main theme of the times. Although China's copyright protection has made great progress, there remains considerable room for growth. It is hoped that, with the development and application of copyright protection technology, the industry's level and environment of copyright protection will keep improving. The original intention and goal of this seminar were to foster a benign and healthy online copyright ecosystem, with stronger copyright protection and more efficient copyright dissemination, so that more and better licensed content can serve the economic and cultural development of society as a whole.