A Security & Privacy Focused Phone with a Secure Supply Chain
The Liberty Phone retains the software security and privacy features of the Librem 5 while adding a transparent, secure supply chain with manufacturing in the USA. The Liberty Phone also has 4GB of memory and 128GB of built-in storage.
Starting at $1,999
A Powerful Tablet with Freedom in Mind
A powerful 4-core tablet with an AMOLED display, a pen with 4,096 pressure levels, and a detachable keyboard. It lets you express your creativity anywhere, anytime.
Shipping with PureBoot (Coreboot + Heads) and PureOS, the Librem 11 is fully yours and respects your privacy, security, and freedom by default.
Starting at $999
A Security & Privacy Focused Phone
The Librem 5 is the original Linux-kernel-based phone produced by Purism, with 3GB of memory and 32GB of storage.
Starting at $799
A powerful 6-core, ultra-portable laptop designed chip by chip, line by line, to respect your rights to privacy, security, and freedom.
Starting at $1,370
Smaller than a Mac Mini, slightly bigger than a Raspberry Pi. More freedom, more privacy, more security.
Starting at $799
Privacy-focused cellular plan for the Librem 5 and other unlocked phones.
Starting at $39/month
The European Union is making bold moves to reshape the digital landscape, and if you're a consumer, developer, or platform operator, you'll want to pay attention. Two major regulatory shifts are now underway: the proposed Digital Fairness Act and the activation of key provisions in the EU AI Act. Together, they signal a new era of transparency, accountability, and ethical design in the digital economy.
The promise of AI is seductive: instant answers, personalized insights, and a frictionless interface with the digital world. But beneath the surface of convenience lies a growing privacy crisis, one that is now impossible to ignore.
The proliferation of AI models across consumer platforms has ushered in a new era of convenience, but it has also accelerated the erosion of personal privacy. Large language models (LLMs) are trained on staggering volumes of data, including publicly available content and, in some cases, personally identifiable information (PII). That means sensitive metadata, everything from search history and location trails to voice recordings and biometric markers, can be folded into systems that behave like omniscient assistants, but without full user transparency or consent. In the monolithic culture of big tech, "innovation" often comes at the cost of ethical boundaries.