Homomorphism – A Beginner’s Guide

Homomorphism is our next step in understanding FHE. If you’re just starting to learn about Homomorphic Encryption, you might be daunted by the amount of abstract math. You should be! A lot of great minds put a lot of effort into making these schemes both secure and performant. In this post, we would like to introduce the concept of homomorphisms in simplified terms. By the end of the post, you should have a good understanding of what homomorphisms allow, what they don’t, and how we use them. This blog post was written by Noam Kleinburd, Director of Cryptography at Chain Reaction.

Homomorphism – Basic Definitions

We’re going to assume that you, the reader, have some background in set theory, group theory, and know the basics of modular arithmetic.

Given two groups (S1, ∘) and (S2, •),

a homomorphism is a function h from S1 to S2 that preserves the group operation:

h(a ∘ b) = h(a) • h(b) for all a, b in S1.

If you’re already feeling lost, don’t worry! Let’s go over this again with a concrete example:

First, we’ll use the integers as both our sets: S1 = S2 = Z.

For our group operation, we’ll use addition: (Z, +).

Finally, our homomorphism is a simple multiplication by two:

h(x) = 2x

It’s easy to see that h(a + b) = 2(a + b) = 2a + 2b = h(a) + h(b).

Multiplication, however, isn’t preserved: h(a · b) = 2ab, while h(a) · h(b) = 4ab.
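To make this concrete, here’s a quick sketch in Python (not from the original post) that checks both properties:

```python
# The homomorphism h(x) = 2x from (Z, +) to (Z, +).
def h(x):
    return 2 * x

a, b = 3, 5

# Addition is preserved: h(a + b) == h(a) + h(b)
assert h(a + b) == h(a) + h(b)   # both sides are 16

# Multiplication is not: h(a * b) = 2ab, but h(a) * h(b) = 4ab
assert h(a * b) != h(a) * h(b)   # 30 vs. 60
```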

Some cryptographic schemes let you perform one operation; they’re called partially homomorphic. If the scheme can perform both addition and multiplication, it’s called fully homomorphic.

In the next sections we’ll look at some partially homomorphic schemes.

RSA

RSA is a well-known cryptosystem publicized by Rivest, Shamir, and Adleman in 1977. It is asymmetric, which means one party generates a private key and a public key. The public key is generated by picking two large primes and multiplying them:

N = p · q

The public key also contains a number e, which is used to encrypt messages: pk = (N, e). The construction of the private key is slightly more complex. We use the product of p − 1 and q − 1:

φ = (p − 1)(q − 1)

and calculate d = e^(−1) mod φ, giving the private key sk = (N, d).

This lets us define encryption and decryption:

Enc(m) = m^e mod N

Dec(c) = c^d mod N

We won’t go into the details of why decryption is the inverse of encryption, or security aspects of this scheme.

Note: This is a description of “textbook” RSA. It is not secure in all cases, and you definitely shouldn’t implement it yourself. Use well-established libraries for cryptography whenever possible.

Notice that the sets we’re using are the numbers modulo N, often denoted Z_N = {0, 1, …, N − 1}.

Let’s see what happens when we multiply two ciphertexts:

Enc(m1) · Enc(m2) = m1^e · m2^e = (m1 · m2)^e = Enc(m1 · m2) (mod N)

Success! We can use RSA to multiply encrypted numbers. However, it’s important to remember that this multiplication is modulo N: if your original numbers are large, an overflow may occur. And multiplication is all we get: we can’t add numbers or divide with flooring.
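As a sketch, here is toy “textbook” RSA in Python with deliberately tiny primes (never use such parameters, or hand-rolled RSA, in practice) demonstrating the multiplicative property:

```python
# Toy "textbook" RSA -- illustration only, never use in real systems.
p, q = 61, 53
N = p * q                      # public modulus, 3233
phi = (p - 1) * (q - 1)        # 3120
e = 17                         # public exponent, coprime to phi
d = pow(e, -1, phi)            # private exponent: e * d == 1 (mod phi)

def enc(m):
    return pow(m, e, N)

def dec(c):
    return pow(c, d, N)

m1, m2 = 7, 11
c1, c2 = enc(m1), enc(m2)

# Multiplying ciphertexts multiplies the underlying plaintexts (mod N):
assert dec(c1 * c2 % N) == (m1 * m2) % N   # decrypts to 77
```

Note that the three-argument `pow` with a negative exponent (for the modular inverse) requires Python 3.8 or later.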

Paillier

This next example is a little more complex and has some very interesting properties. To keep things simple, we’ll modify the scheme to be less secure, so don’t implement this yourself either.

The public key is generated in a similar way as RSA: pick large primes p and q (of equal length) and multiply them.

N = p · q

The private key is calculated as:

λ = lcm(p − 1, q − 1)

To encrypt a message m, we should pick a random number 0 < r < N, but we’ll use the random number 1 (chosen by fair dice roll). Then compute the ciphertext as

c = (1 + N)^m · r^N mod N^2

which, with r = 1, simplifies to c = (1 + N)^m mod N^2.

The decryption formula is long and scary, so we’ll leave it out of this post

(for those of you who are feeling brave: m = L(c^λ mod N^2) · λ^(−1) mod N, where L(x) = (x − 1) / N).

Let’s use the encryption function and compute on some ciphertexts:

Enc(m1) · Enc(m2) = (1 + N)^(m1) · (1 + N)^(m2) = (1 + N)^(m1 + m2) = Enc(m1 + m2) (mod N^2)

This is something new. We multiplied our ciphertexts, and it resulted in adding our messages.
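The multiply-to-add behavior can be sketched in a few lines of Python, using our simplification g = 1 + N and r = 1 (toy parameters, illustration only):

```python
# Simplified Paillier with r = 1 -- insecure, for illustration only.
p, q = 61, 53
N = p * q
N2 = N * N

def enc(m):
    # c = (1 + N)^m mod N^2; by the binomial theorem this equals 1 + m*N (mod N^2)
    return pow(1 + N, m, N2)

m1, m2 = 20, 22
c1, c2 = enc(m1), enc(m2)

# Multiplying ciphertexts ADDS the underlying messages:
assert c1 * c2 % N2 == enc(m1 + m2)
```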

As an exercise, try adding the random number r to the encryption, and see that the homomorphic property still applies.

Conclusion

These are real cryptosystems that are used in our day-to-day life. Each of them is partially homomorphic, which allows some computations to be made, but neither of them allows both addition and multiplication of messages.

 

How FHE Works: A Breakdown Of The Mathematical Foundations

It is probably fair to say that Fully Homomorphic Encryption (FHE) is a relatively new and groundbreaking development, owing in part to its complicated mathematical nature. A practical FHE implementation stands upon the shoulders of several mathematical foundations and scientific principles that have only been brought together in the last 20 years.

This blog post will seek to unravel FHE in a way that shines a clear light on its building blocks. On our way, we will explain each technology’s supporting pillars. Thus, we shall talk through key concepts such as homomorphism, advanced number theory, lattice-based cryptography, computational complexity theory, and more.

FHE Foundations, Turn-By-Turn

Homomorphism

Homomorphism could be called the backbone of FHE, and aptly it is represented by the ‘H’ in the center of the acronym. It is a concept within abstract algebra which refers to a structure-preserving map between two algebraic structures. In the context of FHE, this means that operations (like addition and multiplication) performed on encrypted data yield the same result as if they had been carried out on unencrypted plaintexts, then encrypted.

This fundamental idea enables FHE schemes to process data without ever exposing the raw input. It’s not just a mathematical curiosity—it’s a crucial requirement for privacy-preserving computation.

Number Theory

FHE is deeply rooted in number theory, particularly in the use of modular arithmetic, polynomial rings, and prime fields. These constructs are essential for encoding and manipulating data in ciphertext form.

Many FHE schemes operate over rings of polynomials with coefficients modulo a prime or integer, such as

R_q = Z_q[x] / (x^n + 1)

The algebraic properties of these rings allow FHE systems to implement encrypted operations efficiently while controlling noise growth.
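As an illustrative sketch (with toy parameters of our own choosing, not from any real scheme), multiplication in such a ring can be implemented directly: reducing by x^n + 1 simply means that x^n wraps around to −1.

```python
# Multiplying two polynomials in R_q = Z_q[x] / (x^n + 1), the ring many
# FHE schemes work in. Toy parameters; real schemes use much larger q and n.
q, n = 97, 4

def ring_mul(a, b):
    # a, b: coefficient lists of length n (constant term first)
    res = [0] * n
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            k = i + j
            if k < n:
                res[k] = (res[k] + ai * bj) % q
            else:
                # x^k = -x^(k - n), because x^n = -1 in this ring
                res[k - n] = (res[k - n] - ai * bj) % q
    return res

# (1 + x) * (1 + x) = 1 + 2x + x^2 -- no wraparound yet
assert ring_mul([1, 1, 0, 0], [1, 1, 0, 0]) == [1, 2, 1, 0]

# x^3 * x = x^4 = -1 = q - 1 (mod q): the wraparound in action
assert ring_mul([0, 0, 0, 1], [0, 1, 0, 0]) == [96, 0, 0, 0]
```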

Number theory also underpins ciphertext packing, bootstrapping transformations, and the secure construction of key-switching (such as relinearization). Without this layer of math, FHE wouldn’t be functional or efficient.

Lattice-Based Cryptography

Lattice-based cryptography provides the security foundations for most modern FHE schemes. A lattice is a grid-like structure of points in high-dimensional space. Cryptographic hardness is derived from the difficulty of solving lattice-based problems like:

- Learning With Errors (LWE) and its ring variant, Ring-LWE
- the Shortest Vector Problem (SVP)
- the Closest Vector Problem (CVP)

The above problems are considered hard even for quantum computers, making lattice-based encryption schemes quantum-resistant. They also enable compact ciphertexts, support parallelizable operations, and deliver provable security guarantees, all of which are critical for scaling FHE in real-world applications.
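To give a feel for the first of these, here is a toy LWE sample generator in Python (made-up, insecure parameters, purely for illustration):

```python
import random

# Toy LWE instance: samples (a, b) with b = <a, s> + e (mod q).
# Recovering the secret s from many such samples is believed hard for
# suitable parameters; these tiny parameters are illustration only.
q, n = 97, 8
secret = [random.randrange(q) for _ in range(n)]

def lwe_sample():
    a = [random.randrange(q) for _ in range(n)]
    e = random.choice([-1, 0, 1])   # small noise term
    b = (sum(ai * si for ai, si in zip(a, secret)) + e) % q
    return a, b

a, b = lwe_sample()
# Without the noise e, the secret could be recovered by linear algebra
# after n samples; the noise is exactly what makes the problem hard.
residual = (b - sum(ai * si for ai, si in zip(a, secret))) % q
assert residual in (0, 1, q - 1)   # the noise, reduced mod q
```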

Computational Complexity Theory

This area of computer science evaluates how difficult it is to solve certain problems, particularly with regard to the time and resources required. FHE’s security is based on an asymmetry in complexity: operations like encryption and decryption are easy (polynomial-time), but reversing encryption without the secret key (i.e., solving the underlying lattice problems) is believed to require super-polynomial or exponential computation time, making it infeasible.

Moreover, FHE schemes must carefully balance computational overhead and noise management. Homomorphic operations accumulate noise in ciphertexts, and bootstrapping (a refresh step) is expensive. Understanding this complex landscape allows system designers to optimize performance and ensure security margins are preserved.

FHE: Bringing It All Together

FHE is not a monolithic concept, but rather an intricate synthesis of multiple domains of math and computer science.

These foundations work in tandem to create what was once considered impossible: performing arbitrary computation on encrypted data while it remains confidential.

In Summary

Being able to process encrypted data without it ever being decrypted, yet attaining the same results as if the processing had been carried out on plaintexts, sounds rather magical. However, we know that it works in the real world, backed up by mathematical proof and scientific theory, as outlined above.

Next up, we will be publishing a blog post about the computational cost and other real-world challenges of implementing FHE, and how these will be overcome in the not-too-distant future.

 

Fully Homomorphic Encryption – A Deep Dive into the Basics

Fully Homomorphic Encryption (FHE) is a cryptographic technique that is considered the holy grail of cryptography as it allows for analytical functions to be run directly on encrypted data. Thus, FHE has the potential to unlock huge untapped information resources – encrypted for privacy – and its real-world rollout is expected to deliver enormous benefits in fields like healthcare, industry, and commerce, without compromising sensitive or private data.

At the time of writing, FHE is becoming increasingly practical, with companies offering pure-software solutions for specific applications. However, evaluating complex functions such as AI models, large-scale database queries, or high-performance computing applications remains impractical due to computational overhead. Chain Reaction is well on the way to solving this issue with the development of an ASIC that accelerates FHE computations by orders of magnitude. 3PU™, Chain Reaction’s privacy processor, is designed to enable real-time FHE deployment in cloud environments and AI models, ensuring continuous data privacy.


The Concept of FHE


Data security tools and practices have advanced significantly in recent years. Yet despite this progress, data leaks and breaches continue to occur at alarming rates. For example, in June 2025, UBS confirmed that a cyberattack on its suppliers resulted in the leak of thousands of employee records, and according to Privacy Rights Clearinghouse, the first quarter of 2025 alone saw 658 distinct breaches affecting more than 32 million people.

While organizations are locking down private data, either at the enterprise level or on end-user devices, the unintended consequence is that valuable private datasets remain inaccessible. This limits the ability to extract insights and generate societal value from them.

What if researchers and data-driven organizations could use securely locked data without exposing any sensitive or personal information? With FHE, private data remains encrypted at all stages (at rest, in transit, and in use) while still allowing computations that generate insights, accessible only to those with the private key.

The History Of FHE

The underlying cryptographic technology is based on a concept called ‘privacy homomorphism’ which first gained traction in academic circles in the early 1970s.

Still in the pre-FHE era, in 1977, the publication of the RSA cryptosystem, named after developers Ron Rivest, Adi Shamir, and Leonard Adleman, became a pivotal moment in the timeline. The RSA cryptosystem was the first to exhibit partially homomorphic encryption, where “multiplying two [messages] encrypted with the same key is equivalent to raising the product of the plaintexts to the power of the secret key,” as IEEE.org sums up.

A year later, Rivest and Adleman, alongside Mike Dertouzos, proposed that homomorphic encryption could be used to protect the security of stored data. This became a key driver of the development of FHE.

In 2009, there was a significant breakthrough on the road to FHE with the arrival of the first secure encryption scheme that allowed for unbounded addition and multiplication operations on encrypted data. Craig Gentry spearheaded this development by applying lattice-based cryptography. Gentry, alongside other researchers and organizations, would quickly build on these advances to shift FHE from concept to practicality.

Since Gentry’s milestone breakthrough, we have seen several FHE schemes developed, each with its own strengths and weaknesses. Later in our blog series you will be able to deep-dive into the BGV (where the ‘G’ is for Gentry), BFV, CKKS, and TFHE schemes, and read our analysis of their features and applications.

Probably the only tradeoff in the move from HE to FHE was the significant computational overhead. This would become an even greater hurdle as we moved into the era of big data.

The Significance Of FHE In Modern Cryptography

Readers were introduced to some of the most significant benefits FHE can deliver to both individuals and organizations in our introduction. However, with researchers, governments, and the tech industry discovering the promise of big data in the 2010s, and the potential seen in using large data sets for machine learning and AI in the 2020s, FHE’s power to revolutionize data privacy could now be even more important for modern cryptography.

What Operations Are Possible With FHE?

Talking in mathematical terms, the latest generation of FHE technologies facilitates the use of whatever polynomial expression a researcher might be interested in applying to data sets. This means expressions with variables and operations such as addition, subtraction, multiplication, and exponentiation can work on the encrypted data as if it were plaintext. This flexibility opens an abundance of analytical possibilities.

How To Overcome Limitations Posed By FHE

A brief list of factors hindering the rollout of FHE today would include performance overheads, the lack of standardization, and the limited industry adoption of this cryptographic technology. These are typical of the kinds of obstacles faced by nascent technologies.

In our view, most of the hurdles outlined above are connected. As well-crafted FHE hardware and software solutions advance, emerging standards, such as those in the making by FHETCH, will help pave the way to broader adoption of FHE.

On performance, this is precisely where we have seen the application of Application-Specific Integrated Circuits (ASICs) shift the industry paradigm. It is therefore with some hope that we expect to see a major barrier to FHE adoption broken down with the arrival of the first practical, fast, and efficient FHE-targeted ASIC hardware in the coming months.

Research projects and early applications from hyperscalers such as Google, Meta, Amazon, Microsoft, and Apple suggest a strong strategic interest in FHE. These organizations are actively exploring FHE and are expected to scale adoption as dedicated FHE accelerators become available.

Importantly, it is expected that large organizations like governments and health agencies will gain the confidence to move forward, too, implementing proven FHE technology to make the best use of the valuable data they hold.

In Summary

Thank you for reading our introduction to the basics of Fully Homomorphic Encryption (FHE) technology. Above, we’ve recounted the concept behind FHE, summed up its early development history, briefly discussed why FHE is the Holy Grail of cryptography, and outlined why large-scale, real-time deployments of FHE have yet to materialize, despite growing interest and early non–real-time implementations. Please stay tuned, though, as this is just the beginning of our FHE technical blog series. Coming up, we hope to discuss the mathematical foundations behind the technology, highlight the current performance bottlenecks seen in applying FHE, compare the different FHE schemes that are available, and much more.

UN Report Shows Why Real-Time Encryption Matters for Ethical AI

The United Nations recently released its Governing AI for Humanity report, outlining its vision for responsible, fair, and globally beneficial AI governance. The report underscores the need for AI that respects human rights, ethical standards, and international law, protecting vulnerable groups and ensuring that AI development benefits all of humanity. It advocates for a global, inclusive approach to AI governance to prevent power concentration, bridge digital divides, and incorporate diverse perspectives, especially from underrepresented regions.

Data Privacy in AI

One of the primary areas of interest in the report is that of governance with regard to privacy and data security. This includes a call for an international framework for AI training data that promotes privacy and interoperability across jurisdictions, supporting transparency, accountability and set standards for data ownership and use.

The report warns against potential exploitation of data in competitive markets and advocates for a “race to the top” in which governments, corporations, and public trusts collaborate to empower AI through ethical data usage. It encourages international standards to prevent a decline in privacy protections across regions due to competitive pressures.

Perhaps most importantly, the report recommends adopting privacy-enhancing technologies (PETs) to enable secure data processing without compromising individual privacy. In so doing, the UN recognizes how critical it is to enable secure, encrypted data processing to prevent misuse and build public trust in AI systems, enabling them to grow responsibly and securely, thereby providing even greater benefit.

FHE Unlocks AI’s Potential

A dedicated privacy processor using Fully Homomorphic Encryption (FHE), such as Chain Reaction’s 3PU™, could significantly enhance trust in AI systems. By enabling the analysis of encrypted data without decryption, it ensures sensitive information remains secure throughout processing. This prevents data misuse by eliminating exposure of plain text data during analysis. This layer of protection is valuable for all private data, and especially for vulnerable groups who are at a higher risk of data exploitation.

With FHE-based processing, governments, organizations, companies and more can unlock AI’s full potential while maintaining individuals’ control over their personal information, aligning with the report’s emphasis on privacy-preserving AI governance. This technology would make it possible to utilize AI for social good without compromising privacy, accelerating the UN’s race to the top and achieving the goal of responsible, ethical, and globally beneficial AI.

 

Medical Privacy in the Search for a Cure for Cancer

In today’s digital landscape, privacy has become a top concern. With data breaches making headlines, the fear of losing control over our personal information is stronger than ever. The recent US healthcare data breach, affecting 100 million people, is a daunting example of our vulnerability: our medical records, passwords, financial data, and identity are all at risk.

This is even more true when it comes to protecting our medical and biometric data, perhaps because unlike financial information, which can be protected or, if necessary, reset, one’s biometric data is indelible and permanent, making its protection vital. While governments have taken massive steps forward to ensure our medical privacy through regulation, such efforts do not address the technological advancements that threaten our personal data.

The other side of that technology offers promise for a better world. The combination of high-performance computing and artificial intelligence (AI) has advanced analytical capabilities to enable solutions and treatments for many of today’s most difficult medical challenges. The biggest impediment to such analytical breakthroughs is a lack of real-world data.

The Promise of AI for Healthcare Innovation

AI has the potential to revolutionize healthcare, but it needs vast amounts of data to uncover patterns that indicate diseases early in the process. Given enough information, AI can identify subtle warning signs for cancer or heart disease. For instance, if AI analytics are applied to isolate a pattern of test results that indicates early onset of heart disease, researchers can then watch for that pattern and possibly provide preventative medications to slow or even avoid that onset.

In another example, Google uses AI to analyze DNA samples for specific genomic patterns that indicate a likelihood of developing a certain type of cancer. The NIH uses AI to analyze DNA to link the best medications to the patient’s specific genetic makeup for optimal results. With enough data to work with, it won’t be long before AI could very well identify an actual cure for many cancers.

This leads to a critical question: how can we harness the power of AI for medical advancements without compromising our privacy? To fuel these advancements, AI requires vast amounts of real-world data. But sharing medical information creates an immense risk to our privacy. A new technology is needed – one that protects personal data while enabling incredible medical innovations.

The Real Danger to Our Data Privacy

The single greatest threat to our privacy is data breaches of unencrypted data. While hackers are adept at accessing secure servers, they are mostly powerless against encryption, which keeps the data within those servers securely protected. Data breaches and leaks become harmful when data is stored, transferred, or processed while unencrypted.

While the solution might seem as simple as applying encryption to data at all stages (at rest, in transit, in use), the truth is that today it is impossible to process encrypted data at scale. Therefore, any medical data that you share for processing, whether with a lab for analysis, an insurance company to receive a quote, or an academic institute for medical research, is highly likely to be unencrypted at some point to enable it to be processed.

The Case for Fully-Homomorphic Encryption

One technology that has been developed to address exactly this issue is Fully Homomorphic Encryption (FHE). FHE is a cryptographic technique that enables data to be processed without it ever needing to be decrypted, comparable to a locked vault in which data can be analyzed but not accessed. That means that the privacy of the data is preserved no matter who uses it and no matter what they are using it for.

Suppose, for instance, that an AI machine learning model needs millions of patient records to accurately identify the genetic patterns of a certain type of cancer. Today, providing such information would require patient consent, reducing the likelihood of acquiring enough records for an accurate analysis. It would also be impossible today, using traditional encryption methods, to access and analyze these records without having to decrypt them. That would mean that patient identities and medical records are vulnerable.

With FHE, though, data remains encrypted at all times and at every stage, eliminating the need for the cumbersome patient consent process. Encrypted medical information, even if it does end up in the wrong hands, is gibberish unless it is decrypted. The benefit is twofold: it accelerates medical innovation by providing secure access to valuable data while ensuring that patient privacy is maintained.

The Challenge of Applying FHE at Scale

While FHE already exists and can be applied on a small scale for specific use cases, it cannot yet be implemented at the scale required to enable such medical breakthroughs. This is because there is significant computational overhead involved in the complex cryptographic calculations that are required for processing encrypted data. It is estimated that applying FHE to a cloud-based AI model for analyzing millions of records would require nearly a million times the processing power of today’s processors.

That is why much effort is being made by both government-funded projects, such as DARPA’s DPRIVE, and private corporate efforts, such as Chain Reaction’s 3PU™ privacy processor, to develop a hardware-based accelerator that can implement FHE at scale. Once the processor exists to overcome the computational overhead, the possibilities for privacy-safe medical advancement are endless.

Once FHE is widely adopted, many of our privacy concerns will be assuaged. Our personal data will remain encrypted at all times, unable to be accessed even if it were to be hacked or leaked. And just as importantly, by removing the privacy hurdle, we will open a new world of medical research and innovation, enabling us to live both healthier and more securely.

Biometrics Privacy in the Cloud Era, Part 2

Biometrics continue to be a hot news topic, especially as the technology penetrates ever-further into our lives and privacy concerns escalate. As we discussed in Part 1 of this two-part series of tech shorts, biometrics has moved beyond our smartphones and into the cloud, raising questions about the ability of the various applications to safeguard our personal data when it is (necessarily) unencrypted while being processed.

The growing prevalence of biometrics has led to concerns about their use in payment apps (such as Amazon One) and travel authentications (such as the TSA’s Touchless Identity Solution). Another recent story involves the questionable use of biometrics for security purposes at sports arenas. But the next generation of biometrics goes even deeper into our private makeup, demanding a technological solution to protect our personal data.

Next-Gen Biometrics Challenge Privacy

For example, the recent collapse of the popular DNA testing company 23andMe has brought to the forefront the issue of who owns our private data, in this case our genetic code. Even before its financial problems, the company faced a massive data breach, failing in its responsibility to safeguard sensitive data, and its imminent dissolution then led to accusations that it was selling off DNA data to the highest bidder in order to stay afloat. The recent news of the company’s bankruptcy has also led to speculation that the data will be used to pay off debt.

Another recent example is that of Worldcoin. Sam Altman, co-founder of OpenAI, is trying to implement a groundbreaking way of replacing physical identification (eventually overcoming the scourge of AI bots and fake identities) by creating a worldwide database of iris scans. To incentivize people to participate, each volunteer is rewarded with 25 Worldcoins. Worldcoin is an Ethereum-based cryptocurrency that currently has a market value of about $2 US.

While this project is revolutionary, the question of ownership of the scanned data, the ramifications of a breach of security, and the legality of potentially profiting from the collection of biometric data all must be addressed.

FHE Can Overcome the Concerns

One way to ensure that private data remains private is through Fully Homomorphic Encryption (FHE). FHE allows data to be processed while it remains encrypted, such that ownership is no longer as much of an issue. Whether the data remains with the original collector, is sold elsewhere, or even is hacked, it remains always encrypted and, while the data can be processed and used, it cannot be accessed to violate the privacy of the individual who supplied it.

Certainly, there are still issues with biometrics that must be addressed through legislation and regulation, and in many cases, courts will determine whether these companies are acting within the bounds of fair play. But at least with FHE we can rest assured that our personal biometric data will be kept out of the hands of nefarious individuals and our privacy will be secured.