President's veto. Will Karol Nawrocki halt the Censorship Industrial Complex?

pch24.pl 3 weeks ago

It is good that President Karol Nawrocki vetoed the censorship act of 18 December 2025 amending the Act on Electronic Services and certain other laws, which de facto implements the provisions of the EU Digital Services Act (DSA) in our country. However, this is not the end of the fight for freedom of speech, given the constant pressure exerted by the EC and the further actions announced by Donald Tusk's government.

The president himself called for improved legislation to be prepared within the next two months.

A number of initiatives funded from our taxes have already been approved and launched to build here the kind of Censorship Industrial Complex that is currently being dismantled in the US.

Criticism of the veto

President Nawrocki's veto of the DSA implementing bill displeased the leftist circles that have been involved for years in censoring Poles' statements. One such organisation, drawing on EU and national grants, is Demagog, an association founded in 2014 to fight mis-, mal- and disinformation online.

On its portal, the association criticized the President's decision. The EU regulation applies in all EU countries, even though there are no provisions facilitating its direct application in our country. The national law, however, extended the scope of the EU regulation.

Among Demagog's allegations there is no shortage of arguments that the president is unnecessarily questioning the notion of "illegal content" that is to be removed from the network immediately, suggesting that it only concerns content criminalized by the penal code, such as "fraud, impersonation, paedophile content, piracy or incitement to hatred on national, ethnic, racial or religious grounds". The president objected to the restriction of freedom of expression, which is guaranteed by Article 54 of the Polish Constitution.

"As we explained in one of our analyses," Demagog points out, "the rules that the president blocked from entering into force only concern specific content that violates the law. The concerns of some politicians and social activists that the new law will limit the possibility of criticising the government, the European Union or efforts toward LGBT+ equality, or of expressing support for conservative values, have no basis in this bill."

And that is disinformation (a concept I very much dislike using because of its lack of precision, but which has become widely accepted and which, for the purposes of this article, I frequently invoke). In the analysis "Mis-, dis- and malinformation. How a global censorship system is being built" I describe in more detail the objectives of the DSA regulation, the intentions of EU legislators, how broad the definition of "illegal content" is, and, in particular, the three dangerous legal mechanisms created in this regulation, as pointed out by a group of Dutch lawyers.

The DSA Regulation formulated the concept of "illegal content" (Article 3) broadly, covering all content penalized under national legislation, including disinformation as defined in national law, and "hate speech" (with divergent national approaches, although Eurocrats have declared that they want to harmonize the rules across the EU).

So it is up to the authorities of a particular country to decide what will be "illegal". It is enough for a political formation or coalition to win a majority in the Sejm and have "its own" president in order to push through the relevant law, for example on combating disinformation, "hate speech" or foreign interference in social media, and the catalogue of "illegal content" will grow drastically, allowing the rulers to engage in far-reaching interference with and manipulation of public opinion, enforcing prohibitions on content that, at the time the DSA was adopted, was not even "illegal".

According to Article 3(h) of the Digital Services Act, "illegal content" means "any information that, in itself or in relation to an activity, including the sale of products or the provision of services, is not in compliance with Union law or the law of any Member State which is in compliance with Union law, irrespective of the precise subject matter or nature of that law".

This is an extremely broad and imprecise definition which, as a group of Dutch scholars noted (Ó Fathaigh, Helberger, Appelman, "The perils of legally defining disinformation"), "is not fit to function as a legal category". It is a political definition that allows Member States to interpret widely the phenomena of disinformation, "hate speech", "security" on the network, and so on.

Let us add that the DSA obliges "very large online platforms" and "very large online search engines" to conduct an annual "systemic risk assessment" (Article 34) covering risks stemming from the operation of their services and related algorithmic systems.

The risk catalogue is extensive and includes, for example, "any actual or foreseeable negative effects on civic discourse and electoral processes". Large online platforms and search engines must adapt their algorithmic content- and advertising-moderation systems to these risks.

Article 34 refers, in turn, to "intentional manipulation of their service, including by inauthentic use or automated exploitation of the service, as well as the amplification and potentially rapid and wide dissemination of illegal content and of information that is incompatible with their terms and conditions".

In addition, there is the requirement to take account of national specificities, including the fact that certain content is considered "illegal" in a given country. The risk assessment is subject to audit, so the risk assessment documentation must be kept for three years. At any time during this period, the EC or the Digital Services Coordinator may request it.

A crisis response mechanism has also been introduced (Article 36). At the request of the EC, a very large online platform or search engine may be required to take specific measures to address a major crisis where "extraordinary circumstances lead to a serious threat to public security or public health in the Union or in significant parts of it" (Article 36(2)). The Commission can also issue guidelines and recommendations and propose best practices.

Recital 83 of the DSA, concerning the assessment of systemic risk by large platforms and search engines, draws attention to systemic risks including "coordinated disinformation campaigns related to public health", and recital 84 highlights the risk of disseminating content that is legal but misleading or deceptive, including disinformation.

EU bodies place emphasis on self-regulation and the rigorous enforcement of the "voluntary" commitments adopted by online platforms. This avoids accusations of directly imposing censorship or limiting public debate. With the adoption of the Code of Practice on Disinformation and the code of conduct on countering illegal "hate speech" online, the EC gains an instrument to enforce stricter rules.

Three dangerous legal mechanisms of the DSA

Already at the design stage of the DSA Regulation, several provisions were introduced that are crucial for the revision of EU disinformation policy.

The first concerns Article 9, which requires online platforms to act "without undue delay" against "illegal content" upon an order from "the relevant national judicial or administrative authorities".

A legal mechanism has thus been created that allows representatives of the authorities to demand the removal, "without undue delay", of content deemed "illegal". As indicated above, the definition of "illegal content" is broad and can continue to evolve in particular countries.

The second provision is Article 16, on a mechanism allowing individuals to report content they consider "illegal". Upon notification, the hosting service provider may decide to block it, acting "in a diligent manner" without having to carry out a "detailed legal examination". The hosting service provider must not only register the notice and take certain actions, but also inform the notifier of the actions taken.

The third provision is Article 22, concerning trusted flaggers, which introduces "priority treatment of notices submitted by trusted flaggers, acting within their designated area of expertise (...), and the examination of such notices and decisions on them without undue delay". This legal standard allows priority to be given, for example, to researchers from selected universities and selected fact-checking organisations involved in verifying the reliability of information. Such organisations, as shown by the congressional investigations into the Censorship Industrial Complex in the US, are involved in projects to build a system of global censorship (cases before the U.S. Supreme Court and the investigation by the House of Representatives' Judiciary Committee).

National regulation

The vetoed act implementing the DSA on the Polish market granted enormous powers to the President of the Office of Electronic Communications (UKE), as well as to the National Broadcasting Council (KRRiT) in the field of video-platform supervision and to the President of the Office of Competition and Consumer Protection (UOKiK) in the field of supervision of commercial platforms and other consumer protection matters.

The government also defined a catalogue of "online offences". The first draft law did not provide for a mechanism of judicial review, as even the DSA allows, but only administrative review, revealing the true intentions of the ruling team. Officials could remove content instantly, giving the affected person just two days to present their position. A removal order would be issued by the President of UKE or the Chairman of KRRiT. The decision would not be subject to appeal. In order to defend himself, the author would have to bring an action in court, bearing the associated costs and inconvenience.

Officials would have only two days to respond to a content-removal request when it came from a police officer or prosecutor, seven days when it was submitted by a recipient of the service or a "trusted flagger", and 21 days in particularly complicated cases.

The President of UKE would be the digital services coordinator for DSA-related matters at national and European level and, together with the advisory National Digital Services Council, would, among other things, consider complaints lodged against platforms and conduct administrative proceedings related to the EU Regulation. The bill was to enter into force just 30 days after its promulgation.

In the course of work in the Sejm, some changes were made, including granting the affected person the right to object before a court (Article 11q(2) to (3)). Within 14 days, the affected person would have to demonstrate before the court that they had not violated the law.

The president rightly noted that granting such enormous powers, for example to the President of the Office of Electronic Communications, an official subordinate to the government who would decide what is illegal, resembles the censorship system described by George Orwell in his novel "Nineteen Eighty-Four".

The vetoed bill provided for the status of "trusted flagger": entities that would be given priority by platforms, that would report "illegal content", and that would use taxpayers' money to limit citizens' freedom to, for example, hear other people's opinions and arguments on controversial topics.

Demagog complains that in its daily work it finds many examples of "illegal material" which, despite its actions, "are still present on social networks". Therefore, "additional notification mechanisms would serve to combat illegal material more effectively".

The Bureau of the Council of Polish Media also laments the veto, stressing that it will make it more difficult to "fight disinformation on the Internet, at a time when almost every day brings more lies and half-truths from across the eastern border", and will "obstruct the fight against other extremely harmful content distributed online, such as incitement to suicide or the glorification of paedophilia". In this body's opinion, the President's veto is a "lost opportunity to improve the safety of Poles".

The president rightly recalled that "limiting freedom of speech is unconstitutional" and that "exploiting the interests of the youngest is immoral".

The government plans to focus on fighting disinformation in 2026

Both the Minister of Digital Affairs Krzysztof Gawkowski and Prime Minister Donald Tusk have announced that they will not rest in the fight against disinformation and will continue to take action limiting Poles' freedom of speech. That is, freedom of expression, including judgments, opinions, conjectures and information about facts, but also the freedom to obtain and disseminate information (because that is what freedom of speech actually means), and all this under the pretext of fighting paedophilia and Russian disinformation and of ensuring safety on the network, whatever that really means.

In December of last year Minister Gawkowski announced that the priority of his ministry in 2026 would be the fight against disinformation, to which huge funds from the state budget are to be allocated. Among other things, an educational campaign and interministerial cooperation were planned to create a national framework for combating disinformation. Drawing on British, German and, above all, American experience from the Biden-Harris administration, this is de facto about building a "Censorship Industrial Complex". This phenomenon has been described and documented by the influential U.S. House Judiciary Committee in its several interim reports from the investigation into the "Censorship Industrial Complex", which includes many government agencies, university centres and fact-checking networks developing tools and new technologies capable of censoring communications on a large scale. In the U.S., under the pretext of addressing critical threats to communication systems and of "combating false information/disinformation", university centres developed advanced censorship tools using "artificial intelligence". For example, an MIT programme targeted conservatives, rural residents, war veterans and their families. All of these groups were said to be "more susceptible to disinformation campaigns" because they had greater confidence in the Constitution and the Bible, and preferred to arrive at the truth themselves, analysing content and referring to "original sources", rather than trusting "professional consensus".

The Judiciary Committee has named, among others, the Stanford Internet Observatory, the University of Washington, the Atlantic Council's Digital Forensic Research Lab, Graphika, the networks of fact-checking organisations and a number of federal agencies, as well as Big Tech, which sought to push certain topics out of the public space as threats to democracy, "election integrity", "information integrity" and policy-making.

In Minister Gawkowski's intention, a large role in the fight against disinformation is to be played by the Research and Academic Computer Network (NASK), which became notorious for its own disinformation campaign in 2025, for which it was condemned even by the Organization for Security and Co-operation in Europe (OSCE).

Gawkowski also announced that the Ministry of Digital Affairs will focus on the development of artificial intelligence (AI) in Poland. A legal framework for AI is to be established and, as may be assumed, one of the applications of AI may be the enforcement of automated censorship.

Media reports also indicate that Gawkowski's ministry is working on a new draft law implementing the DSA, and is most keen to provide powerful backing for "trusted flaggers", i.e. networks of progressive fact-checking organisations.

After the President's veto, the EC threatened legal action. European Commission spokesman Thomas Regnier announced that Eurocrats are looking into the situation in our country and "encouraged" Poland to adopt the missing rules as soon as possible and to strengthen the powers of the national digital services authorities.

Already in the summer, the EC launched a case against Poland, which, like the Czech Republic, Cyprus, Spain and Portugal, had failed to appoint a DSA coordinator. The case was referred to the EU Court of Justice on the pretext of failure to fulfil obligations.

Why is the DSA so important to EU bodies?

The EU regulation is used to interfere in the internal elections of the countries concerned. This act was used to annul the Romanian presidential election in 2024 and to prosecute the political opposition. Similarly, the DSA is invoked in Germany to combat the influence of the AfD. Most importantly, these provisions are designed to restrict access to certain information not only for EU citizens but also for citizens of other countries around the world, including through decisions of the CJEU.

For this reason, American legislators declared war on the Digital Services Act. An additional argument for them is the burdening of Big Tech with content-moderation requirements and demands for easier access to algorithms (transferring know-how, for example, to trusted flaggers).

Washington decided to impose sanctions on a former EU Commissioner and several activists who restrict freedom of speech in Europe. On 23 December 2025, Secretary of State Marco Rubio issued a decision under the Immigration and Nationality Act (INA), banning five Europeans involved in content-moderation activities from entering the United States and ordering their deportation if found there.

These five people, whom the Secretary of State provocatively described as part of the "global Censorship Industrial Complex", conducting "organised actions to force US platforms to censor, demonetize and suppress the American views they oppose", and guilty also of "advanced censorship repression", include the former French Commissioner and EC Vice-President Thierry Breton, the creator of the EU Digital Services Act (DSA), who during the 2024 U.S. presidential campaign demanded that the head of platform X, Elon Musk, not make available to Europeans an interview with Donald Trump, then a candidate for a second term. He even threatened that the EC would take "retaliatory action" against X under the DSA. Breton resigned shortly afterwards, reportedly under pressure from EC President Ursula von der Leyen.

The other individuals are activists of European NGOs dealing with internet censorship: Josephine Ballon and Anna-Lena von Hodenberg of the German organisation HateAid; Clare Melford of the British Global Disinformation Index, which works to influence platforms to demonetize websites allegedly linked to "disinformation and harmful content"; and Imran Ahmed, who heads the British Center for Countering Digital Hate and currently resides in the US. He has filed a lawsuit against the authorities in connection with the planned deportation.

The EC responded immediately to Rubio's decision, "firmly condemning" the actions of the United States, and French President Emmanuel Macron even demanded a tightening of the DSA regulation in response to attempts at "intimidation and coercion aimed at undermining European digital sovereignty" (the Americans had begun threatening Mistral over its involvement in censorship). German Foreign Minister Johann Wadephul reacted similarly nervously, declaring the Americans' actions "unacceptable".

The House of Representatives Judiciary Committee, chaired by Jim Jordan, has for several years been conducting an extensive investigation into the Censorship Industrial Complex, which involves European entities, as I describe in the analysis mentioned above, as well as in articles on PCh24.pl presenting extensive findings from the committee's hundreds of pages of interim reports.

The committee recently found that the European DSA regulation constitutes an "obvious threat to American freedom of speech", and that "European regulators define political statements, humor and other content protected by the First Amendment to the Constitution as 'disinformation' and 'hate speech', and then demand that platforms change their global content-moderation rules to censor them".

The committee recalled that the DSA, supplemented by official codes of conduct on disinformation, demands the moderation of legal content that has "a negative impact on civic discourse, electoral processes and public security". The preamble to the DSA requires platforms to "pay particular attention" to misleading or deceptive content such as disinformation.

The European Commission is taking action to ensure that the DSA is respected by the platforms and is imposing heavy fines, which can reach 6% of a Big Tech company's global revenue. In December last year, it imposed its first fine, of EUR 120 million, on X, accusing Musk's platform of deceptively designing its "blue checkmark" for verified accounts, of maintaining an advertising repository that did not meet transparency and accessibility requirements, and of imposing improper restrictions on external researchers' access to its data.

Rubio's sanctions are probably a response to the EC's action against X.

It is worth noting that the DSA draws on the 2017 German Network Enforcement Act (NetzDG), which prohibits insulting politicians and state institutions, incitement to hatred (Volksverhetzung), insulting religious communities and glorifying the Holocaust. These provisions, dictated by the negative experience of the Nazi era, are derived from the doctrine of "militant democracy". The NetzDG has prompted social media platforms to remove more content than is legally required: platform moderators in doubt about whether to delete content do so in line with the principle "when in doubt, remove".

Currently, the German Federal Ministry of Justice and Consumer Protection is to seek to extend prison sentences for the offence of incitement to racial hatred, and to introduce a ban on standing for election for up to five years, in order to fight the influence of the Alternative for Germany (AfD) party, which is gaining popularity across the country.

In March last year, the EC published the "DSA Elections Toolkit", which aims to support digital services coordinators in ensuring "election integrity". These tools include: (1) stakeholder management; (2) communication and media literacy; (3) monitoring and analysis of election risks; and (4) incident response. A closer look at this "toolkit" makes it clear that the EC aims to create a "Censorship Industrial Complex" like the American one, which the current Trump administration is trying to dismantle.

In November 2025 Brussels presented the European Democracy Shield, a plan to strengthen the EU's capacity to fight foreign information manipulation and interference (FIMI), whose enforcement is based on the Digital Services Act (DSA).

This is a flagship idea of Commission chief Ursula von der Leyen. EU Rule of Law Commissioner Michael McGrath pointed out that "the investments and actions we are now taking will determine the condition of our democracy and the stability of our societies for the next generation of European citizens."

The plan is largely based on the enforcement of the DSA and the use of so-called artificial intelligence to manage disinformation online.

The Commission is to establish a DSA protocol on incidents and crises to better coordinate the activities of authorities during large-scale operations, while working with partner platforms to detect disinformation.

To combat interference in elections directly, the Member States and the EC will update the DSA Elections Toolkit. A European network for electoral cooperation will be established and a plan will be drawn up to counter foreign manipulation and disinformation. Finally, the EU budget and the Member States will finance a European network of fact-checkers.

EU officials assure that these initiatives are not intended to create a "ministry of truth" or to "control content", but merely to "ensure transparency and enable democratic debate in an environment where people know where information comes from".

Ultimately, a European Centre for Democratic Resilience is to be established to coordinate cooperation on better anticipating and managing threats to democracy, that is, disinformation during elections.

"It is essential to strengthen coordination, to reduce fragmentation and to guarantee that all opportunities and expertise are pooled," said 1 of the EC officials, promising to support journalists through fresh legal safeguards and the Media Resistant Programme.

Citizens are to be "educated" (read: influenced and manipulated) in recognizing disinformation.

Regardless of these initiatives, we already know that various progressive entities, together with the authorities, so-called fact-checkers and scholars from various American, British or German universities, have illegally censored online statements. This has been shown by the reports of the U.S. House Judiciary Committee. Similarly, left-wing German politicians used HateAid to censor their opponents and then lied about it. In October last year, the German president awarded von Hodenberg the Federal Cross of Merit for "strengthening democratic values on the Internet" and "action at EU level for a safe and democratic Internet". In June last year, HateAid received the status of a "trusted flagger" under the DSA, becoming a kind of official online informer.

The EU's DSA regulation delegates censorship to "trusted flaggers": unaccountable entities that pursue their own political or ideological objectives and, on top of that, receive our money for their activities.

In the U.S., investigations have shown that Twitter and Facebook created special "portals" for government-funded NGOs so that they could "flag" posts they wanted censored. These organizations employed, among others, former military and intelligence personnel, and aimed at mass censorship, promoting certain narratives while suppressing others. For example, the Stanford group helped the U.S. government with Covid censorship and then lied about it, as documents showed.

Brazil and other countries require social media platforms to make user data available to non-governmental organisations selected by government agencies, which on that basis demand "content moderation", censorship or deamplification, i.e. limiting the visibility and reach of material.

Consistently "previous" censors fight "populism" and "anti-globalism".

Recently, Deputy Minister of Science and Higher Education Prof. Andrzej Szeptycki announced a new initiative, together with the University of Warsaw, to combat "scientific disinformation". The project, worth at least several million zlotys, is to fight disinformation on vaccination, climate change theory and the like, using celebrities, among others.

In January 2022, the Royal Society, the British scientific association and one of the oldest in the world, suggested that censoring online even harmful scientific news was not a good solution.

The Society spoke out against the British Online Safety Bill, which focuses on the harm that information causes to individuals, while overlooking wider "societal harms". Disinformation on scientific issues, from vaccine safety to climate change, can harm both individuals and society. However, as the Society stressed, forcing platforms to remove such content can lead to an even greater crisis of trust in authorities and institutions.

Professor Frank Kelly, a Cambridge University mathematician, noted that "science balances on the edge of error" and that "there is always uncertainty". And if it is claimed, as in the early days of the "pandemic", that the "science" on a given issue is "settled" and certain, and corrections of earlier findings then follow, people will stop believing scientists. Meanwhile, constantly examining the truth, checking and questioning every "wisdom" is "an integral part of the progress of science and society."

There is no doubt that progressive circles, which, as most forecasts suggest, will steadily lose importance in the coming years, will do everything in their power to introduce censorship. That is how they want to protect their position.

They aim to implement a system of ideocracy, well described by Professors Marek Bankowicz and Wiesław Kozub-Ciembronewicz in the study "Dictatorships and Tyrannies. Sketches on Undemocratic Power", who stress that totalitarian ideology "does not stop at formulating prescriptions of a purely political nature". "Totalitarian ideology has many features of a par excellence religious faith. Over time, this kind of ideology increasingly detaches itself from reality, fleeing into a world of fiction of its own creation. Totalitarianism is therefore always more or less an ideocracy, a system in which ideological super-reality displaces reality. The existing world and its concepts are supplanted by a specific fiction, while words lose their conventional meaning and take on a new one."

In this system, only a few possess knowledge of the rules governing the order of the world, and as a new "elite" they try to lead the people, imposing rules of conduct on the passive masses. Eventually, everyone learns to live a lie. That is what censorship is for.

Agnieszka Stelmach
