
The Power of Belief

By Niazi, April 21, 2024

Life is our most precious asset; taking the lives of others, or giving our own for the sake of others, is the final frontier of our expression of belief. Before we kill another human, or for that matter an animal, our mind has fully justified the killing; just before we commit suicide, whether for a cause or to relieve our suffering, our mind has fully justified the act. The 9/11 hijackers believed in: "Let those (believers) who sell the life of this world for the hereafter fight in the cause of Allah, and whoso fights in the cause of Allah and is slain or gets victory, We shall bestow on him a great reward." (Qur'an)[1]. The carnage that followed was massive, taking nearly 3,000 lives, but there were also 19 who laid down their own lives believing in what the Qur'an says. Belief has remarkable power, unparalleled by any other human expression. It was early spring of 1941 when Hitler called Himmler to his office; Himmler sat at the desk of Hitler's secretary as Hitler announced his decision to eliminate the Jews; Himmler put his head in his hands and said: "My God, my God, what am I expected to do?" The end result was that almost two-thirds of the 9 million Jews in Europe at that time were killed, including one million children. If we add to this count the homosexuals and the other prisoners that the Nazis also executed, the toll rises to over 10 million people. Hitler believed that those who subscribed to the Judaic religion and other different lifestyles were a menace to the world and had to be eliminated. In "Hitler: Ascent 1889-1939" by Volker Ullrich[2], we see how Adolf Hitler, described by an eminent magazine editor in 1930 as a "half-insane rascal," a "pathetic dunderhead," a "nowhere fool," a "big mouth," rose to power in the land of Goethe and Beethoven.
What persuaded millions of ordinary Germans to embrace him and his doctrine of hatred? How did this "most unlikely pretender to high state office" achieve absolute power in a once democratic country and set it on a course of monstrous horror? So we see that it was not Hitler alone: all those who allowed him to gain supreme power and inflict this atrocity are equally to blame; belief has many sides. By the time he died in 1953, Stalin had earned a place in history as one of the most ruthless rulers ever. His style was different from Hitler's direct killing; he engineered a famine in Ukraine to destroy those asking for freedom, and over 7 million perished. Thousands of intellectuals were shot at point-blank range, accused of rising up against the regime. He believed that the human spirit can and should be crushed by force. "Stalin: The First In-depth Biography Based on Explosive New Documents from Russia's Secret Archives" by Edvard Radzinsky[3] paints a picture of the Soviet strongman as more calculating, ruthless, and blood-crazed than has ever been described or imagined. Stalin was a man for whom power was all, terror a useful weapon, and deceit a constant companion. Stalin's script was stolen unashamedly by Ayatollah Ruhollah Khomeini, the religious ruler of Iran, who beginning on 19 July 1988 ordered the killing of the majority of the supporters of the People's Mujahedin of Iran, the Fedaian, and the Tudeh Party of Iran (the Communist Party), as well as intellectuals, including a past prime minister of Iran. Khomeini believed that those who did not share his religious values should be eliminated to preserve religion. He destroyed a beautiful country. In his famous Little Green Book[4] he states: Jihad means the conquest of all non-Muslim territories. Such a war may well be declared after the formation of an Islamic government worthy of that name, at the direction of the Imam or under his orders.
It will then be the duty of every able-bodied adult male to volunteer for this war of conquest, the final aim of which is to put Qur'anic law in power from one end of the earth to the other. But the whole world should understand that the universal supremacy of Islam is considerably different from the hegemony of other conquerors. It is therefore necessary for the Islamic government first to be created under the authority of the Imam in order that he may undertake this conquest, which will be distinguishable from all other wars of conquest, which are unjust and tyrannical and disregard the moral and civilizing principles of Islam. That is how he justified his acts. The book is so full of minutiae about how we should live day by day that it almost appears comical, yet millions hold it in the highest esteem, no different from Mao's Red Book. It all began in 2003 and continues today: a government-backed Arab militia, whose name loosely translates as "devils on horseback," began killing the populace of Darfur; they destroyed entire villages, murdering and raping as they tortured their own kind, to the tune of more than half a million dead and over 3 million displaced. The Sudanese rulers believed that force is the only way to move people into submission. In Gerard Prunier's book[5] we see why the conflict in Darfur qualifies as genocide; he asks whether Darfur is not horrible enough to justify a claim on the world's conscience. The ethnic cleansing started by Hitler continued to raise its ugly head, as we saw in the genocide of Bosnian Muslims at Srebrenica and Zepa by Bosnian Serb forces in 1995. General Mladic believed that ethnic cleansing was the only way to purify a nation. From 1993 to 1995, tens of thousands of Bosnian Muslims were killed. It was not until the summer of 1995, when the US decided to intervene, that the genocide came to an end. [6] The Geneva Centre for the Democratic Control of


The Sweet Tooth Syndrome

By Niazi, April 21, 2024

The human body craves sweet things as a survival instinct, having figured out in the days of foraging that the only storage of calories available was the body itself; we never knew when we would come across another tree full of sweet hangings. From the earlier apes 10 million years ago, to the first bipeds 5 million years ago, to the first humans about 2.7 million years ago, to about 100,000 years ago when we, Homo sapiens, came out of East Africa, the history of evolution teaches us one lesson: while our abilities have come a long way, our capabilities remain the same; 100,000 years is too small a period to change our hardware. We are still the foraging species that lived in groups of about 100-150, began the day looking for food, and kept moving. With no storage facilities, the only thing we could do was gorge ourselves, for we never knew when we would catch the next deer or come across a fig tree. Our ability to domesticate crops changed the game rapidly; suddenly we can go to the nearest Mariano's and fill up our shopping cart with a million calories for a small investment. But our hardware remains the same, evoking the survival response. And that explains why we crave sweet things: it is hard-wired into us. The same goes for the rest of our eating habits. So far, I have said things that you have heard enough times, and you are wondering, there goes another pundit. I will not tell you that you can cheat your survival instincts, but I can tell you from personal experience that our abilities have flourished over millions of years because we have a built-in capability to learn how to fool ourselves. Instead of fighting the sweet tooth syndrome, feed it with worthless sweets: sugar-free Jello (10 calories), pieces of watermelon, occasionally a small piece of ice cream (yes, that is mostly fat), and, when in need, a lick of Splenda.
And if that does not work, don't despair: you are only a couple of years away from no longer drooling at a freshly baked chocolate chip cookie.

Copyright 2024 © Niazi | Developed By AssaptR


On the Death of Moore’s Law

By Niazi, April 21, 2024

Moore's law refers to an observation made by Intel co-founder Gordon Moore in 1965. He noticed that the number of transistors per square inch on integrated circuits had doubled every year since their invention. The prediction held so well for so many years that we began calling it a law, an immutable observation. We have now reached a limit to compression, and the transistor must be replaced with nanotubes or biomolecules if we are to pack more circuits into a small space. The death of Moore's Law teaches us the wisdom of creating a new modeling base, which I am calling the Discontinuous Exponential Evolution Model, or DEEM. Technology evolution has a way of being discontinuous, because the growth of technology feeds back into its own rate of growth. The rate constant in an exponential equation, taken to be a constant in Moore's law, is itself subject to change, causing a discontinuous shift, either as a spike or as a nadir. In the 1960s, President Kennedy's declaration that we would reach the moon before the decade was over led to billions of dollars invested in technology, the benefits of which are evident in just about every technology around us. The discovery of the structure of DNA fifty years ago altered the course of biotechnology. These were spike events, and the resulting growth of technology could not have been predicted from past trends, the standard way models are built. Some call this change double exponential growth, but that assumes the variation in the rate constant is cumulative. The fact is that the rate constant can be additive or subtractive, creating a discontinuous evolution. The feedback loop on the speed of change is not modelable, because it cannot predict the course of technology growth in unforeseen areas.
For example, new tools created by technology can create new technology that can produce better tools, significantly affecting the growth cycle. The DEEM is better appreciated as we examine our own evolution, coming out of East Africa about 100,000 years ago as one of the many species of the Homo genus. A sudden growth of our cerebral cortex allowed us to surpass other Homo species that had languished for over 2 million years. The superiority of this new Homo species, which we audaciously called sapiens, is an example of a spike in the DEEM. While we may claim to know when spikes or nadirs will arrive, we cannot; that is precisely what defeats any model premised on certainty. Figure 1 shows a comparison of three situations. The bottom line is normal exponential growth, in this case 10% per year. The middle line is a double exponential model where the rate itself grows by 10% per year: 10% in the first year, 11% in the second, 12.1% in the third, and so on. The top curve is the DEEM curve, where the rate increases by 10% per year (the double exponential model) but also receives an additional 10% bump every five years. The DEEM is a more realistic view of the change coming in technology. It suggests we may reach a stage where continuous growth skyrockets as these bumps arrive. If Moore's Law is dead, we know that a spike is coming. However, what we have learned is that we should not be too quick to declare an observation a law; perhaps a hypothesis is the better word. Figure 1. Multiples as a function of the time unit. Comparison of exponential, double exponential, and double exponential plus periodic spikes.
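The three curves of Figure 1 can be sketched in a few lines of Python. The parameters mirror the description above (10% base rate, the rate itself growing 10% per year, a 10% bump every five years); everything else is illustrative:

```python
def growth_curves(years=30, base_rate=0.10, rate_growth=0.10,
                  spike_every=5, spike=0.10):
    """Multiples over time for three growth models (illustrative parameters):
    plain exponential, double exponential, and DEEM (double exponential
    plus a periodic spike in the rate)."""
    exp_val = dbl_val = deem_val = 1.0
    dbl_rate = deem_rate = base_rate
    rows = []
    for year in range(1, years + 1):
        exp_val *= 1 + base_rate        # plain exponential: fixed 10%/yr
        dbl_val *= 1 + dbl_rate         # double exponential: rate itself grows
        if year % spike_every == 0:
            deem_rate += spike          # DEEM: periodic bump in the rate
        deem_val *= 1 + deem_rate
        rows.append((year, exp_val, dbl_val, deem_val))
        dbl_rate *= 1 + rate_growth     # the rate constants grow 10%/yr
        deem_rate *= 1 + rate_growth
    return rows

for year, exp_val, dbl_val, deem_val in growth_curves():
    print(year, round(exp_val, 1), round(dbl_val, 1), round(deem_val, 1))
```

Running it shows the DEEM curve pulling far ahead of both the plain and double exponential curves within a few decades, which is the qualitative point of Figure 1.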


On Being an Inventor

By Niazi, April 21, 2024

With around 100 inventions to my name, I can speak to the experience of being an inventor. The image of a light bulb for an idea is not far-fetched. Owning patents in a variety of fields, some far from my professional field, is the fun part of inventing. Under US patent law, an invention must fall into one of four categories. A "process" is a process, act, or method of doing or making something, and primarily includes industrial or technical processes. A "machine" is anything that would commonly be considered such, from a clockwork to a tractor to a computer. A "manufacture" refers to articles that are made, and includes all manufactured articles. A "composition of matter" is a chemical composition, and may include mixtures of ingredients as well as new chemical compounds. There are then four tests to know whether a flash of an idea (what lawyers call the subject matter) can become an invention. Is it patentable? Some subject matter, such as the laws of nature, physical phenomena, and abstract ideas, is not patentable. This includes a discovery, scientific theory, or mathematical method; an aesthetic creation; a scheme, rule, or method for performing a mental act, playing a game, or doing business; a computer program; a presentation of information; and a procedure for surgical or therapeutic treatment, or diagnosis, practiced on humans or animals. A new mineral discovered in the earth or a new plant found in the wild is not patentable subject matter. Likewise, Einstein could not have patented his celebrated E=mc2, nor could Newton have patented the law of gravity. Such discoveries are manifestations of nature, free to all people and reserved exclusively to none. The Atomic Energy Act of 1954 excludes the patenting of inventions useful solely in the utilization of special nuclear material or atomic energy for atomic weapons.
In the case of mixtures of ingredients, such as medicines, a patent is not granted unless the mixture does more than the sum of the effects of its components. It is of interest to note that so-called "patent medicines" are not patented; the phrase "patent medicine" does not mean that the medicine has a patent. Is it novel? Has anyone mentioned the invention anywhere? Prior art means other patents, anything on the Internet, and a thesis hiding in a dusty library thousands of miles away. I am always amazed when I check out a new idea on the Internet and find that I am the last one to have thought of it. It is humbling. Crafting a search query takes a little practice, but it is not very difficult. Ninety percent of ideas fail on this requirement. Is it non-obvious? Would someone holding an ordinary qualification in the field of the invention have thought of it? Obviousness is the hardest hurdle to overcome. In today's advanced science, one with an "ordinary" qualification has a Ph.D. and years of experience, so how would you know that he or she is not capable of thinking what you are thinking? Most patents that are determined to be novel are then denied on this ground. The obviousness determination extends further to inherency, which means that even if the specific application of an idea has not been spelled out, if the function is present in something already reported, you have no invention. Is it useful? "Useful" refers to the condition that the subject matter has a useful purpose, and that includes functionality; a machine that does not perform its intended purpose would not be called useful and therefore would not be granted patent protection. One can find a use for just about anything under the sun, so this remains the least of the hurdles. Once an idea is put to the above tests, it becomes evident why an invention is a fleeting dream.
Being an inventor does not mean that you are a genius or that you understand the science better; it only tells you that you were the first to put the nuts and bolts together for a useful idea (the subject matter). The Socratic method, also known as the method of elenchus, the elenctic method, or Socratic debate, is named after the classical Greek philosopher Socrates. It is a form of inquiry and discussion between individuals, based on asking and answering questions to stimulate critical thinking and to illuminate ideas. When you are in inventing mode, the conversation takes place with yourself. You keep asking questions until you hit a wall. It is at this point that you come up with an alternative proposition. Let me share a few examples from my personal experience. Bioreactors, or fermenters, are used to brew beer, make recombinant drugs, and grow organs. We have been using them for thousands of years, but the first US patent for a bioreactor was issued on 1 April 1842 to C. C. Edday (US Patent 2,535), titled "Fermenting Vat." Today, 170 years later, we still use an upright vat, a stirring propeller, a sugar solution, and a biological entity to produce everything from wine and beer to drugs and organs. At a commercial scale these vats are tall, sometimes two or three stories, and in most cases require a tall ceiling to operate. When I started my company to manufacture recombinant drugs using bacteria and mammalian cells, I ran into a wall: the manufacturing facility made available to me, thanks to a kind investor, had an 8-ft ceiling. The constraint was enough to keep big investors from helping me out. After a few weeks of pondering, I came up with the idea of laying the vat horizontally instead of vertically. When it lies horizontally, the height of the bioreactor no longer limits its size.
Combining this idea with a single-use bioreactor, a replacement for the stirring mechanism, and many other variations that were not possible in an upright vat, I was able to secure dozens of patents. It turned out that the new configuration of


Amorphous Scalia and The New USA

By Niazi, April 21, 2024

The death of Justice Antonin Scalia in 2016 reminded me of his 1987 vote for the Louisiana law that forbade public schools to teach evolution without also instructing students in "creation science." Chief Justice William Rehnquist joined Scalia, who was heavily criticized by academics; as Stephen Jay Gould of Harvard said, "I regret to say that Justice Scalia does not understand the subject matter of evolutionary biology." Scalia was open about the reasons for his decision. He asked, "What can we really know for sure?", criticizing his colleagues for treating the evidence for evolution as "conclusive." To Scalia, "creation science" was indeed a science as well. To the other justices it was a constitutional issue of church and state, but to Scalia it was a matter of belief, not necessarily in creation, but in the lack of proof against creation science. In the 17th century, Galileo was ordered to turn himself in to the Holy Office to stand trial for holding the belief that the Earth revolves around the Sun, which the Catholic Church deemed heretical. In 1616, exactly 400 years before Scalia's death, the Inquisition found heliocentrism to be formally heretical; heliocentric books were banned, and Galileo was ordered to refrain from holding, teaching, or defending heliocentric ideas. More than 350 years after the Roman Catholic Church condemned Galileo, Pope John Paul II rectified one of the Church's most infamous wrongs. Three centuries from today, we will look back and find a member of the Supreme Court just as primitive as the Church of four centuries ago. But with a caveat. The judiciary must judge cases on their merits and beyond a reasonable doubt. The task becomes onerous when the judgment involves amorphous considerations.
There is no sharp line in judging when a decision rests on values, morality, and, more particularly, on complexity that is beyond the capability of a judge to understand. Scalia admitted that he did not get the science, and his beliefs regarding creation raised moral issues for him. What people three hundred years from now will find is that there was a drastic change in American values as judges rendered a plurality of decisions on amorphous grounds that changed our society for all time to come. It was too late.


Why Do We Exist?

By Niazi, April 21, 2024

About 80,000 years ago, a variant of the Homo genus, the Sapiens, came out of East Africa and spread across the globe, annihilating the other Homo titleholders and many large animal species, and now finally devouring the environment that supports its existence. The renewal of species is a common occurrence in the mega plan of evolution, and tomorrow's Homo species will be very different from us; today, however, we cannot predict how. The span of 80,000 years is not enough to produce any significant changes in our genetic code, one quirk of which made us curious and inquisitive, perhaps as a means of protecting us against the unknown; the survival of our species depended on it. Some day, when we no longer fear walking into a dark room, we will have overcome this genetic coding, but not yet. Our curiosity and inquisitiveness were endless 80,000 years ago, and they remain so today, except that we now have a better vocabulary to frame our questions. The first question that came to the mind of the foraging Sapiens was: "How do we survive?" We had not yet domesticated crops or bred animals for food, and we lived off whatever came into our path as we moved around, mostly in groups of fewer than 100, for anything bigger caused a split and the rise of another group that was not likely to be friendly to us. While our toolmaking skills were superior to those of the Neanderthal and Erectus species, where we truly excelled was in organizing our groups. We realized early in our foraging times that some of us were better at one task than others, and that gave rise to what we today call professions. As tasks were assigned, it became difficult for us to leave the group, for we became dependent on others in the group for our survival; thus grew societies and kingdoms.
Those who were good at ruling found it a great profession, and thus dictators, pharaohs, prophets, kings, and bigots rose. You can now appreciate how one question resulted in the creation of civilization. Another question, which came long after we had assured our survival, was "Where are we?" We could see the stars around us and wanted to know our place in the arena of whatever was visible to us. Our condescending nature led us to believe that we are the focus of the Universe. Historically, several locations have been proposed as the center of the Universe. Many mythological and religious cosmologies included an Axis Mundi, the central axis of a flat Earth that connects the Earth, the heavens, and other realms. In 4th-century-BCE Greece, the geocentric model was developed from astronomical observation, proposing that the center of the Universe lies at the center of a spherical, stationary Earth, around which the sun, moon, planets, and stars rotate. With the development of the heliocentric model by Nicolaus Copernicus in the 16th century, the sun was believed to be the center of the Universe, with the planets (including Earth) and stars orbiting it. In the early 20th century, the discovery of other galaxies and the development of the Big Bang theory led to cosmological models of a homogeneous, isotropic Universe, which lacks a central point and is expanding at all points. The reason we resisted accepting that we are not the center of the Universe comes from our reluctance to realize how small and insignificant we are. The narcissism bred into our genes has not yet left our construction. The question "Who are we?" created great tumult in human society because of its amorphous nature. The early foraging Sapiens, desiring to grow their communities, produced con artists who sold gods, and what better way to connect than by assuring us that we are indeed the chosen people.
The story caught on well, and every religion claims that its followers are the righteous ones, or else why would anyone believe in it? Today, most of us believe we are a creation of God, of one kind or another, despite the indisputable theory of evolution proposed by Darwin in the mid-19th century. Evolution is a change in the heritable characteristics of biological populations over successive generations. Evolutionary processes give rise to biodiversity at every level of biological organization, including the levels of species, individual organisms, and molecules. All life on Earth shares a common ancestor known as the last universal common ancestor (LUCA), which lived approximately 4.1 billion years ago. Repeated formation of new species, change within species, and loss of species (extinction) throughout the evolutionary history of life on Earth are demonstrated by shared sets of morphological and biochemical traits, including shared DNA sequences. (We still carry a few genetic sequences of the Neanderthals.) More than 99 percent of all species that ever lived on Earth are estimated to be extinct, and estimates of Earth's current species range from 10 to 14 million. Primates diverged from other mammals about 85 million years ago; the Hominini tribe (humans, the Australopithecines and other extinct bipedal genera, and chimpanzees) parted from the Gorillini tribe (gorillas) some 8-9 million years ago, and a couple of million years later we further separated into more refined human and bipedal ancestors. The creation-evolution controversy is an ongoing, recurring cultural, political, and theological dispute about the origins of the Earth, of humanity, and of other life.
Within much of the Christian world (fundamentalists excepted), evolution by natural selection has been accepted as empirical scientific fact, with the gloss that "evolution requires the creation of beings that evolve." Ironically, the rules of genetic inheritance underlying evolution were first discovered by a Catholic priest, the Augustinian monk Gregor Mendel, who is known today as the founder of modern genetics. Most other religions deny evolution outright, as it threatens their very foundation. While the question "Who are we?" remains disputed, it has lost much of its


Thermodynamic Equivalence to Demonstrate Bioequivalence

By Niazi, April 21, 2024

The 1980s saw the emergence of generic drugs in the US, which has saved patients hundreds of billions of dollars and improved access to drugs. The statute that created this abbreviated approval pathway stated that therapeutic equivalence means the same concentration of active drug at the site of action, an evaluation that was neither possible nor practical. So the FDA recommended blood level studies as a surrogate test for demonstrating bioequivalence. Later, the FDA agreed to remove this requirement, via the biowaiver, for drugs that are highly soluble. I am now introducing a new concept, thermodynamic equivalence, in lieu of bioequivalence testing. Thermodynamic equivalence (TE) is, by another name, the basis for the biowaivers that have been in place for years: for a highly soluble drug, the free-energy barrier (ΔG) is small, overcoming any differences between two products. I am expanding this concept to drugs subject to blood level studies. Why would a drug product fail bioequivalence when it contains the same chemical entity? It is inevitably the release profile at the site of delivery, since from that point forward all factors apply equally. Dissolution rate testing is the best example of measuring chemical potential, and while it works well for products with small ΔG, it fails for drugs that are not released instantly. Creating a matrix of dissolution profiles independent of any physiologic conditions, such as a 3×3 matrix, may be able to discern differences not picked up by current dissolution testing. This is not a theoretical suggestion; it is already in practice, for instance in the comparison of biologics, where a different approach to matching critical quality attributes (CQAs) allows the FDA to approve them without requiring phase 3 studies.
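As one illustration of such a matrix, the conventional f2 similarity factor could be computed in each cell of a grid of test conditions. The media, speeds, and profiles below are invented for the sketch, and f2 stands in for whatever metric a TE protocol would ultimately adopt:

```python
import math

def f2_similarity(ref, test):
    """f2 similarity factor for two dissolution profiles given as % dissolved
    at matched time points; f2 >= 50 is conventionally read as 'similar'."""
    n = len(ref)
    msd = sum((r - t) ** 2 for r, t in zip(ref, test)) / n  # mean squared difference
    return 50 * math.log10(100 / math.sqrt(1 + msd))

# Hypothetical matrix cells: (medium, paddle speed) -> (reference, test profiles).
# A full 3x3 design would hold nine such cells.
matrix = {
    ("pH 1.2", "50 rpm"): ([15, 40, 70, 92], [14, 38, 68, 90]),
    ("pH 6.8", "75 rpm"): ([10, 35, 65, 88], [12, 37, 66, 89]),
}

for condition, (ref, test) in matrix.items():
    score = f2_similarity(ref, test)
    print(condition, round(score, 1), "similar" if score >= 50 else "dissimilar")
```

Identical profiles score 100; the score falls as the profiles diverge, so a cell-by-cell comparison across the matrix exposes release differences that a single-condition test may miss.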
The industry should now attempt to use this concept to request biowaivers, particularly for highly complex product designs, to reduce cost and time to market. It was after years of similar discussion that the FDA agreed to look into the concept of TE, which can be used continually to assure life-cycle therapeutic equivalence. The FDA has now opened up this discussion, agreeing that the concept needs to be explored further. http://www.prnewswire.com/news-releases/pharmaceutical-scientist-inc-fda-calls-for-public-comments-on-bioequivalence-testing-300489368.html?tc=eml_cleartime From writing suggestions for the first bioequivalence guidance to biosimilarity testing, I have been engaged with the FDA, and I am now confident that we are entering a phase of scientific reality that can further reduce the burden of drug development.


A Biosimilar Delayed is a Biosimilar Denied

By Niazi, April 21, 2024

After decades of experience taking biosimilars from cell line to market, and after facing both successes and failures, I feel qualified to talk about what not to do when it comes to making biosimilars accessible (available and affordable). I am inviting all of my friends (and I have many) to join me at this conference, where I provide a step-by-step approach to securing FDA approval of biosimilars. This is an experience based on first-hand working knowledge of how the FDA lays out its expectations. I have been instrumental in contributing to FDA guidelines and have written detailed treatises on the science and technology of biosimilar approvals under the 351(k) and 505(b)(2) provisions.


Making Alcoholic Beverages Unique and Affordable – New Patent Issued 29 August 2017.

By Niazi, April 21, 2024

The alcoholic beverage industry is bigger than the pharmaceutical industry, over $1.5 trillion worldwide, yet the technology for aging alcohol, letting it sit in a barrel, dates back thousands of years. While I have developed many technologies, with dozens of patents, to make biological drugs more affordable (technologies now used to reduce the COGS of biosimilars), I am now reporting my new invention for the continuous aging of alcoholic beverages: it eliminates wood barrels and accelerates the natural process thousands of times over by manipulating the thermodynamics underlying Fick's law of diffusion. For the first time, we can introduce proprietary tastes using several types of wood simultaneously and produce the desired product at a much lower cost. It is about time we became more creative with what is perhaps the largest product category in the world. Yes, we should make drugs affordable, but why not also the beverage that makes billions happy every day? This patent is one of about a dozen on the fast aging of alcoholic beverages, including wines, whiskies, and others, allowing storage in multiple-use bottles without affecting the quality of the content, among many other inventions to change a technology that has seen little change over 7,000 years.
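The physics behind the acceleration claim can be illustrated with Fick's first law: flux across the wood-liquid interface is proportional to contact area, so the extraction rate per litre scales with the surface-area-to-volume ratio. The geometry below is hypothetical and deliberately ignores the many other variables of real aging:

```python
def extraction_rate_ratio(area_a, vol_a, area_b, vol_b):
    """Relative wood-extraction rate of configuration B versus A, assuming
    Fick's first law (flux proportional to contact area) and identical
    diffusivity and concentration gradient in both configurations.
    Illustrative geometry only; real aging involves far more variables."""
    return (area_b / vol_b) / (area_a / vol_a)

# Hypothetical comparison: a 200 L barrel exposing ~2 m^2 of wood versus the
# same 200 L dosed with wood pieces exposing ~200 m^2 of surface.
print(extraction_rate_ratio(2.0, 200.0, 200.0, 200.0))  # -> 100.0
```

Even this toy calculation shows how a hundredfold rate increase follows from geometry alone; the thousandsfold figure quoted above would additionally rely on the thermodynamic manipulations the patent describes.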


Fingerprint-Like Non-Inferiority Similarity Demonstration

By Niazi, April 21, 2024

The FDA, and now the EMA, emphasize analytical similarity as the pivotal step to minimize residual uncertainty, but creating a plan to demonstrate fingerprint-like similarity has been difficult because of improper use of testing methods and protocols, statistical modeling, and knowledge of the variability within and between lots. I have addressed this issue in several of my books, including Biosimilars and Interchangeable Biologics: Tactical Issues (https://www.niazi.com/handbooks-and-technical-books), but now I have invented a new method that makes fingerprint-like similarity testing possible. This should help bring biosimilars to market faster by assuring the least residual uncertainty in the early phases of development. What will make biosimilars accessible (available and affordable) is a mindset that supports FDA thinking on scientific approaches, which the Agency has demonstrated repeatedly; the ball is in the court of developers to give the FDA reasons to approve biosimilars without lengthy, expensive clinical trials. I will be happy to provide more information and details to anyone wishing to examine their approach. While the concept of non-inferiority testing has been applied to clinical trials, it is yet to be applied to analytical similarity testing; the testing protocols shared by the FDA in its reviews of three biosimilars (filgrastim, insulin glargine, and bevacizumab) show a more conventional approach to demonstrating similarity, not non-inferiority. The CQAs for the latter are different from those used for demonstrating similarity. My inventions include methods to demonstrate difference through a process of thermodynamic extrapolation.
I have shared the novelty aspects as well as the non-inferiority trial designs with regulatory agencies, and they have all encouraged this approach. I will soon be filing a Citizen Petition to advance these new technologies and the proposed approach. Being able to demonstrate non-inferiority is a double-edged sword: biosimilar developers can use it to show fingerprint-like similarity, but the LBP companies can use the same methods to show a lack of similarity, whether clinically meaningful or not. I strongly urge the scientific community in the biologics arena to come up with better tests that allow the FDA to approve products without requiring phase 3 studies in cases where those studies take a long time to complete and add to the cost of biosimilar development.
