In 1953 James Watson and Francis Crick built the first accurate model of DNA, one of the great scientific advances of all time. From that point on, the rest of the century was characterized by the “genetic” revolution, a gene-centered era. Almost 50 years later, with the completion of the Human Genome Project, usually dated to 14 April 2003, the genomic era began. The public availability of the human genome sequence did not provide instant knowledge, but thanks to it, more and more aspects of human existence have become the subject of scientific and technological analysis.
The knowledge of the human genome sequence represented a tremendous improvement in our understanding of the code of life and therefore created great expectations regarding the diagnostic and therapeutic potential in this field.
Before the advent of the “genomic” era in which we find ourselves today, genetic variants were divided into two main categories: mutations and polymorphisms, both contributing to the genetic variability that makes each individual unique.
The term mutation indicates a point difference an individual carries relative to the reference genome that can disrupt the function of a gene and result in a pathological phenotype. By contrast, the term polymorphism indicates an event that leaves the functionality of a gene unaltered.
Until a few years ago, the distinction between mutations and polymorphisms was based mainly on their frequency in the general population.
Polymorphisms were referred to as DNA variations with minimal or no phenotypic effect and a frequency in the general population higher than 1%. Mutations were instead considered rare events giving rise to disease-associated phenotypes, with a frequency lower than 0.1% in the general population.
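The historical, frequency-only rule described above can be sketched in a few lines of Python. The two thresholds come from the text; the function name and the handling of the gray zone between the cutoffs are illustrative assumptions:

```python
# Hypothetical sketch of the pre-genomic-era, frequency-only distinction.
# Thresholds (1% and 0.1%) are taken from the text above.

def classify_by_frequency(allele_frequency: float) -> str:
    """Classify a variant using only its population allele frequency."""
    if allele_frequency > 0.01:    # more frequent than 1% of the population
        return "polymorphism"
    if allele_frequency < 0.001:   # rarer than 0.1% of the population
        return "mutation"
    return "unclassified"          # the gray zone between the two cutoffs

print(classify_by_frequency(0.05))    # a common variant
print(classify_by_frequency(0.0002))  # a rare variant
```

Even this toy version makes the weakness visible: a variant sitting between the two thresholds fits neither category, which foreshadows why the distinction eventually broke down.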
In recent years, however, the enormous development of large-scale sequencing techniques has made it possible to analyze, in whole or in part, the genomes of thousands of individuals. This exposed the fragility of the overly simple distinction between mutations and polymorphisms.
In fact, it has been found that variants involved in some pathologies can have a frequency even greater than 1% across a specific population, while rare variants are not necessarily associated with a pathology.
Consequently, the most recent guidelines for human geneticists, the ACMG/AMP guidelines, suggest abandoning the purely frequency-based distinction between polymorphisms and mutations in favor of a more detailed definition of genetic variants, which classifies them as “pathogenic”, “likely pathogenic”, “of uncertain significance”, “likely benign” and “benign”, depending on their effect on the phenotype.
From a simple distinction between mutations and polymorphisms we have thus moved to a detailed classification system that distinguishes different levels of pathogenicity based on 28 rules evaluating several types of evidence at the variant and phenotype level.
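As a rough illustration of how such rule-based classification works, a small subset of the published ACMG/AMP combining rules can be sketched as follows. This is a deliberately simplified, hypothetical example: it covers only a few of the rule combinations, ignores conflicting evidence, and is not how any production tool actually works:

```python
from collections import Counter

# Simplified, illustrative sketch of combining ACMG/AMP evidence codes
# (e.g. "PVS1", "PS3", "PM2", "BA1") into a five-tier classification.
# Only a subset of the published combining rules is modeled here.

def combine_acmg_evidence(codes):
    """Return a five-tier class from a list of evidence codes."""
    n = Counter()
    for code in codes:
        # Each code starts with its strength prefix (PVS = very strong
        # pathogenic, PS = strong, PM = moderate, PP = supporting;
        # BA = stand-alone benign, BS = strong, BP = supporting).
        for prefix in ("PVS", "PS", "PM", "PP", "BA", "BS", "BP"):
            if code.startswith(prefix):
                n[prefix] += 1
                break
    # Pathogenic: one very-strong plus at least one strong, or two strong
    if (n["PVS"] >= 1 and n["PS"] >= 1) or n["PS"] >= 2:
        return "pathogenic"
    # Likely pathogenic: one very-strong plus one moderate,
    # or one strong plus one or two moderate criteria
    if (n["PVS"] == 1 and n["PM"] == 1) or (n["PS"] == 1 and 1 <= n["PM"] <= 2):
        return "likely pathogenic"
    # Benign: one stand-alone or at least two strong benign criteria
    if n["BA"] >= 1 or n["BS"] >= 2:
        return "benign"
    # Likely benign: one strong plus one supporting, or two supporting
    if (n["BS"] == 1 and n["BP"] >= 1) or n["BP"] >= 2:
        return "likely benign"
    return "uncertain significance"

print(combine_acmg_evidence(["PVS1", "PS3"]))  # strong pathogenic evidence
print(combine_acmg_evidence(["PP3"]))          # a single supporting criterion
```

Even this toy version shows why the full 28-rule system resists manual application: each criterion must first be assessed correctly, and the combinations grow quickly.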
However, the interpretation of variants according to the ACMG/AMP guidelines is nowadays one of the most complex tasks, because it is extremely susceptible to the personal judgment of the geneticists who perform the analyses. For example, even when a genetic test detects a pathogenic variant that may cause a Mendelian disease, the interpretation of the results is complicated by the evaluation of the penetrance (complete or incomplete) and expressivity of that variant.
Software that interprets genomic variants is therefore necessary to support geneticists, speeding up and facilitating the analysis. This means more accuracy, more speed, and less dependence on the geneticist’s subjectivity.
We are strongly committed to improving the interpretation process, and our software eVai was conceived to do exactly that. By combining Artificial Intelligence with the ACMG guidelines, eVai accurately classifies and prioritizes every genomic variant, suggesting a list of possible related genetic diagnoses.
There is still a long way to go before we understand the role all genetic variants play in the human genome. Sometimes even interpretation performed with powerful software is not enough to reach a final diagnosis, but embracing the digital genomic era will bring more knowledge, fewer errors and greater accuracy. Are you ready for that?