Entry Level Jobs In Pharmaceutical Companies

Entry-level jobs in pharmaceutical companies are a good way to start your career in the pharmaceutical industry, especially if you are interested in pursuing a career as a pharmacist or chemist. With an entry-level job, you will have the opportunity to learn more about the industry and gain valuable experience that can help you move up within the company.

Entry-level jobs may offer training and benefits packages. The benefits package may include health insurance, retirement plans, vacation time and sick leave. Some companies offer tuition reimbursement programs for employees who want to attend college classes part time or full time.

Entry-level jobs in pharmaceutical companies usually require a high school diploma or equivalent. Most employers prefer applicants who have completed some college courses, or coursework at a community college or technical school, in chemistry, biology, or pharmacy science. Many employers also favor applicants with classes in medical terminology, anatomy, or physiology, which help in understanding how medicine works at a molecular level when new medications are developed for diseases such as cancer or heart disease.

The pharmaceutical industry discovers, develops, produces, and markets drugs or pharmaceutical drugs for use as medications to be administered to patients (or self-administered), with the aim to cure them, vaccinate them, or alleviate symptoms.[1][2] Pharmaceutical companies may deal in generic or brand medications and medical devices. They are subject to a variety of laws and regulations that govern the patenting, testing, safety, efficacy, and marketing of drugs. The global pharmaceuticals market produced treatments worth $1,228.45 billion in 2020 and showed a compound annual growth rate (CAGR) of 1.8%.[3]
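For readers unfamiliar with the term, a compound annual growth rate is the constant yearly rate that would carry a starting value to an ending value over a period. A minimal sketch of the calculation (the figures below are illustrative, not taken from the market report cited above):

```python
def cagr(start_value, end_value, years):
    """Compound annual growth rate: the constant yearly rate that
    grows start_value into end_value over the given number of years."""
    return (end_value / start_value) ** (1 / years) - 1

# Illustrative example: a market growing from 100 to 121 over 2 years
# compounds at 10% per year (100 -> 110 -> 121).
growth = cagr(100.0, 121.0, 2)
```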

History
Main article: History of pharmacy
Mid-1800s – 1945: From botanicals to the first synthetic drugs
The modern era of the pharmaceutical industry began with local apothecaries that expanded from their traditional role of distributing botanical drugs such as morphine and quinine to wholesale manufacture in the mid-1800s, and from discoveries resulting from applied research. Intentional drug discovery from plants began with the isolation between 1803 and 1805 of morphine – an analgesic and sleep-inducing agent – from opium by the German apothecary assistant Friedrich Sertürner, who named this compound after the Greek god of dreams, Morpheus.[4] By the late 1880s, German dye manufacturers had perfected the purification of individual organic compounds from tar and other mineral sources and had also established rudimentary methods in organic chemical synthesis.[5] The development of synthetic chemical methods allowed scientists to systematically vary the structure of chemical substances, and growth in the emerging science of pharmacology expanded their ability to evaluate the biological effects of these structural changes.

Epinephrine, norepinephrine, and amphetamine
By the 1890s, the profound effect of adrenal extracts on many different tissue types had been discovered, setting off a search both for the mechanism of chemical signalling and efforts to exploit these observations for the development of new drugs. The blood pressure raising and vasoconstrictive effects of adrenal extracts were of particular interest to surgeons as hemostatic agents and as treatment for shock, and a number of companies developed products based on adrenal extracts containing varying purities of the active substance. In 1897, John Abel of Johns Hopkins University identified the active principle as epinephrine, which he isolated in an impure state as the sulfate salt. Industrial chemist Jōkichi Takamine later developed a method for obtaining epinephrine in a pure state, and licensed the technology to Parke-Davis. Parke-Davis marketed epinephrine under the trade name Adrenalin. Injected epinephrine proved to be especially efficacious for the acute treatment of asthma attacks, and an inhaled version was sold in the United States until 2011 (Primatene Mist).[6][7] By 1929 epinephrine had been formulated into an inhaler for use in the treatment of nasal congestion.

While highly effective, the requirement for injection limited the use of epinephrine, and orally active derivatives were sought. A structurally similar compound, ephedrine (actually more similar to norepinephrine), was identified by Japanese chemists in the Ma Huang plant and marketed by Eli Lilly as an oral treatment for asthma. Following the work of Henry Dale and George Barger at Burroughs-Wellcome, academic chemist Gordon Alles synthesized amphetamine and tested it in asthma patients in 1929. The drug proved to have only modest anti-asthma effects but produced sensations of exhilaration and palpitations. Amphetamine was developed by Smith, Kline and French as a nasal decongestant under the trade name Benzedrine Inhaler. Amphetamine was eventually developed for the treatment of narcolepsy, post-encephalitic parkinsonism, and mood elevation in depression and other psychiatric indications. It received approval as a New and Nonofficial Remedy from the American Medical Association for these uses in 1937[8] and remained in common use for depression until the development of tricyclic antidepressants in the 1960s.[7]

Discovery and development of the barbiturates

Diethylbarbituric acid was the first marketed barbiturate. It was sold by Bayer under the trade name Veronal.
In 1903, Hermann Emil Fischer and Joseph von Mering disclosed their discovery that diethylbarbituric acid, formed from the reaction of diethylmalonic acid, phosphorus oxychloride and urea, induces sleep in dogs. The discovery was patented and licensed to Bayer pharmaceuticals, which marketed the compound under the trade name Veronal as a sleep aid beginning in 1904. Systematic investigations of the effect of structural changes on potency and duration of action led to the discovery of phenobarbital at Bayer in 1911 and the discovery of its potent anti-epileptic activity in 1912. Phenobarbital was among the most widely used drugs for the treatment of epilepsy through the 1970s and, as of 2014, remains on the World Health Organization's List of Essential Medicines.[9][10] The 1950s and 1960s saw increased awareness of the addictive properties and abuse potential of barbiturates and amphetamines, which led to increasing restrictions on their use and growing government oversight of prescribers. Today, amphetamine is largely restricted to use in the treatment of attention deficit disorder and phenobarbital to the treatment of epilepsy.[11][12]

Insulin
A series of experiments performed from the late 1800s to the early 1900s revealed that diabetes is caused by the absence of a substance normally produced by the pancreas. In 1889, Oskar Minkowski and Joseph von Mering found that diabetes could be induced in dogs by surgical removal of the pancreas. In 1921, Canadian professor Frederick Banting and his student Charles Best repeated this study and found that injections of pancreatic extract reversed the symptoms produced by pancreas removal. Soon, the extract was demonstrated to work in people, but development of insulin therapy as a routine medical procedure was delayed by difficulties in producing the material in sufficient quantity and with reproducible purity. The researchers sought assistance from industrial collaborators at Eli Lilly and Co. based on the company’s experience with large scale purification of biological materials. Chemist George B. Walden of Eli Lilly and Company found that careful adjustment of the pH of the extract allowed a relatively pure grade of insulin to be produced. Under pressure from the University of Toronto and a potential patent challenge by academic scientists who had independently developed a similar purification method, an agreement was reached for non-exclusive production of insulin by multiple companies. Prior to the discovery and widespread availability of insulin therapy, the life expectancy of diabetics was only a few months.[13]

Early anti-infective research: Salvarsan, Prontosil, Penicillin and vaccines
The development of drugs for the treatment of infectious diseases was a major focus of early research and development efforts; in 1900, pneumonia, tuberculosis, and diarrhea were the three leading causes of death in the United States, and mortality in the first year of life exceeded 10%.[14][15]

In 1911 arsphenamine, the first synthetic anti-infective drug, was developed by Paul Ehrlich and chemist Alfred Bertheim of the Institute of Experimental Therapy in Berlin. The drug was given the commercial name Salvarsan.[16] Ehrlich, noting both the general toxicity of arsenic and the selective absorption of certain dyes by bacteria, hypothesized that an arsenic-containing dye with similar selective absorption properties could be used to treat bacterial infections. Arsphenamine was prepared as part of a campaign to synthesize a series of such compounds and found to exhibit partially selective toxicity. Arsphenamine proved to be the first effective treatment for syphilis, a disease which prior to that time was incurable and led inexorably to severe skin ulceration, neurological damage, and death.[17]

Ehrlich’s approach of systematically varying the chemical structure of synthetic compounds and measuring the effects of these changes on biological activity was pursued broadly by industrial scientists, including Bayer scientists Josef Klarer, Fritz Mietzsch, and Gerhard Domagk. This work, also based on the testing of compounds available from the German dye industry, led to the development of Prontosil, the first representative of the sulfonamide class of antibiotics. Compared to arsphenamine, the sulfonamides had a broader spectrum of activity and were far less toxic, rendering them useful for infections caused by pathogens such as streptococci.[18] In 1939, Domagk received the Nobel Prize in Medicine for this discovery.[19][20] Nonetheless, the dramatic decrease in deaths from infectious diseases that occurred prior to World War II was primarily the result of improved public health measures such as clean water and less crowded housing; the impact of anti-infective drugs and vaccines was significant mainly after World War II.[21][22]

In 1928, Alexander Fleming discovered the antibacterial effects of penicillin, but its exploitation for the treatment of human disease awaited the development of methods for its large scale production and purification. These were developed by a U.S. and British government-led consortium of pharmaceutical companies during the Second World War.[23]

Early progress toward the development of vaccines occurred throughout this period, primarily in the form of academic and government-funded basic research directed toward the identification of the pathogens responsible for common communicable diseases. In 1885, Louis Pasteur and Pierre Paul Émile Roux created the first rabies vaccine. The first diphtheria vaccines were produced in 1914 from a mixture of diphtheria toxin and antitoxin (produced from the serum of an inoculated animal), but the safety of the inoculation was marginal and it was not widely used. The United States recorded 206,000 cases of diphtheria in 1921 resulting in 15,520 deaths. In 1923, parallel efforts by Gaston Ramon at the Pasteur Institute and Alexander Glenny at the Wellcome Research Laboratories (later part of GlaxoSmithKline) led to the discovery that a safer vaccine could be produced by treating diphtheria toxin with formaldehyde.[24] In 1944, Maurice Hilleman of Squibb Pharmaceuticals developed the first vaccine against Japanese Encephalitis.[25] Hilleman would later move to Merck where he would play a key role in the development of vaccines against measles, mumps, chickenpox, rubella, hepatitis A, hepatitis B, and meningitis.

Unsafe drugs and early industry regulation

In 1937 over 100 people died after ingesting a solution of the antibacterial sulfanilamide formulated in the toxic solvent diethylene glycol
Prior to the 20th century, drugs were generally produced by small scale manufacturers with little regulatory control over manufacturing or claims of safety and efficacy. To the extent that such laws did exist, enforcement was lax. In the United States, increased regulation of vaccines and other biological drugs was spurred by tetanus outbreaks and deaths caused by the distribution of contaminated smallpox vaccine and diphtheria antitoxin.[26] The Biologics Control Act of 1902 required federal premarket approval for every biological drug and for the process and facility producing such drugs. This was followed in 1906 by the Pure Food and Drugs Act, which forbade the interstate distribution of adulterated or misbranded foods and drugs. A drug was considered misbranded if it contained alcohol, morphine, opium, cocaine, or any of several other potentially dangerous or addictive drugs, and if its label failed to indicate the quantity or proportion of such drugs. The government’s attempts to use the law to prosecute manufacturers for making unsupported claims of efficacy were undercut by a Supreme Court ruling restricting the federal government’s enforcement powers to cases of incorrect specification of the drug’s ingredients.[27]

In 1937 over 100 people died after ingesting “Elixir Sulfanilamide” manufactured by S.E. Massengill Company of Tennessee. The product was formulated in diethylene glycol, a highly toxic solvent chemically similar to the antifreeze ingredient ethylene glycol.[28] Under the laws extant at that time, prosecution of the manufacturer was possible only under the technicality that the product had been called an “elixir”, which literally implied a solution in ethanol. In response to this episode, the U.S. Congress passed the Federal Food, Drug, and Cosmetic Act of 1938, which for the first time required pre-market demonstration of safety before a drug could be sold, and explicitly prohibited false therapeutic claims.[29]

The post-war years, 1945–1970
Further advances in anti-infective research
The aftermath of World War II saw an explosion in the discovery of new classes of antibacterial drugs,[30] including the cephalosporins (developed by Eli Lilly based on the seminal work of Giuseppe Brotzu and Edward Abraham),[31][32] streptomycin,[33] the tetracyclines[34] (discovered at Lederle Laboratories, now a part of Pfizer), and erythromycin (discovered at Eli Lilly and Co.),[35] and their extension to an increasingly wide range of bacterial pathogens. Streptomycin, discovered during a Merck-funded research program in Selman Waksman’s laboratory at Rutgers in 1943, became the first effective treatment for tuberculosis. At the time of its discovery, sanatoriums for the isolation of tuberculosis-infected people were a ubiquitous feature of cities in developed countries, with 50% of those admitted dying within 5 years of admission.[33][36]

A Federal Trade Commission report issued in 1958 attempted to quantify the effect of antibiotic development on American public health. The report found that over the period 1946–1955, there was a 42% drop in the incidence of diseases for which antibiotics were effective and only a 20% drop in those for which antibiotics were not effective. The report concluded that “it appears that the use of antibiotics, early diagnosis, and other factors have limited the epidemic spread and thus the number of these diseases which have occurred”. The study further examined mortality rates for eight common diseases for which antibiotics offered effective therapy (syphilis, tuberculosis, dysentery, scarlet fever, whooping cough, meningococcal infections, and pneumonia), and found a 56% decline over the same period.[37] Notable among these was a 75% decline in deaths due to tuberculosis.[38]

[Figure: Measles cases reported in the United States before and after introduction of the vaccine. Cases followed a highly variable epidemic pattern of 150,000–850,000 per year through 1963, fell to fewer than 25,000 by 1968, spiked in outbreaks around 1971, 1977, and 1990, and declined to a few dozen per year by the 2000s.]
[Figure: Percent surviving by age in the United States in 1900, 1950, and 1997.[14]]
During the years 1940–1955, the rate of decline in the U.S. death rate accelerated from 2% per year to 8% per year, then returned to the historical rate of 2% per year. The dramatic decline in the immediate post-war years has been attributed to the rapid development of new treatments and vaccines for infectious disease that occurred during these years.[21][22] Vaccine development continued to accelerate, with the most notable achievement of the period being Jonas Salk’s 1954 development of the polio vaccine under the funding of the non-profit National Foundation for Infantile Paralysis. The vaccine process was never patented but was instead given to pharmaceutical companies to manufacture as a low-cost generic. In 1960 Maurice Hilleman of Merck Sharp & Dohme identified the SV40 virus, which was later shown to cause tumors in many mammalian species. It was later determined that SV40 was present as a contaminant in polio vaccine lots that had been administered to 90% of the children in the United States.[39][40] The contamination appears to have originated both in the original cell stock and in monkey tissue used for production. In 2004 the United States National Cancer Institute announced that it had concluded that SV40 is not associated with cancer in people.[41]

Other notable new vaccines of the period include those for measles (1962, John Franklin Enders of Children’s Medical Center Boston, later refined by Maurice Hilleman at Merck), rubella (1969, Hilleman, Merck), and mumps (1967, Hilleman, Merck).[42] The United States incidences of rubella, congenital rubella syndrome, measles, and mumps all fell by >95% in the immediate aftermath of widespread vaccination.[43] The first 20 years of licensed measles vaccination in the U.S. prevented an estimated 52 million cases of the disease, 17,400 cases of mental retardation, and 5,200 deaths.[44]

Development and marketing of antihypertensive drugs
Hypertension is a risk factor for atherosclerosis,[45] heart failure,[46] coronary artery disease,[47][48] stroke,[49] renal disease,[50][51] and peripheral arterial disease,[52][53] and is the most important risk factor for cardiovascular morbidity and mortality in industrialized countries.[54] Prior to 1940 approximately 23% of all deaths among persons over age 50 were attributed to hypertension. Severe cases of hypertension were treated by surgery.[55]

Early developments in the field of treating hypertension included quaternary ammonium sympathetic nervous system blocking agents, but these compounds were never widely used: they had severe side effects, the long-term health consequences of high blood pressure had not yet been established, and they had to be administered by injection.

In 1952 researchers at Ciba discovered the first orally available vasodilator, hydralazine.[56] A major shortcoming of hydralazine monotherapy was that it lost its effectiveness over time (tachyphylaxis). In the mid-1950s Karl H. Beyer, James M. Sprague, John E. Baer, and Frederick C. Novello of Merck and Co. discovered and developed chlorothiazide, which remains the most widely used antihypertensive drug today.[57] This development was associated with a substantial decline in the mortality rate among people with hypertension.[58] The inventors were recognized by a Public Health Lasker Award in 1975 for “the saving of untold thousands of lives and the alleviation of the suffering of millions of victims of hypertension”.[59]

A 2009 Cochrane review concluded that thiazide antihypertensive drugs reduce the risk of death (RR 0.89), stroke (RR 0.63), coronary heart disease (RR 0.84), and cardiovascular events (RR 0.70) in people with high blood pressure.[60] In the ensuing years other classes of antihypertensive drugs were developed and found wide acceptance in combination therapy, including loop diuretics (Lasix/furosemide, Hoechst Pharmaceuticals, 1963),[61] beta blockers (ICI Pharmaceuticals, 1964),[62] ACE inhibitors, and angiotensin receptor blockers. ACE inhibitors reduce the risk of new onset kidney disease (RR 0.71) and death (RR 0.84) in diabetic patients, irrespective of whether they have hypertension.[63]
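A relative risk (RR) of 0.63 means the event rate in the treated group is 63% of the rate in the control group. To see what such a figure implies in absolute terms, one can combine it with a baseline event rate; a minimal sketch, using the RR values above but an entirely hypothetical baseline risk:

```python
def absolute_risk_reduction(baseline_risk, rr):
    """Absolute risk reduction: the control-group event rate minus
    the treated-group rate implied by the relative risk."""
    return baseline_risk * (1 - rr)

def number_needed_to_treat(baseline_risk, rr):
    """Average number of patients treated to prevent one event."""
    return 1 / absolute_risk_reduction(baseline_risk, rr)

# Hypothetical 10% baseline stroke risk combined with the RR 0.63 above:
# a reduction of 3.7 percentage points, i.e. roughly 27 patients treated
# per stroke prevented.
arr = absolute_risk_reduction(0.10, 0.63)
nnt = number_needed_to_treat(0.10, 0.63)
```

The baseline risk of 10% is an assumption for illustration only; the Cochrane review itself reports relative, not absolute, effects.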

Oral Contraceptives
Prior to the Second World War, birth control was prohibited in many countries, and in the United States even the discussion of contraceptive methods sometimes led to prosecution under Comstock laws. The history of the development of oral contraceptives is thus closely tied to the birth control movement and the efforts of activists Margaret Sanger, Mary Dennett, and Emma Goldman. Based on fundamental research performed by Gregory Pincus and synthetic methods for progesterone developed by Carl Djerassi at Syntex and by Frank Colton at G.D. Searle & Co., the first oral contraceptive, Enovid, was developed by G.D. Searle & Co. and approved by the FDA in 1960. The original formulation incorporated vastly excessive doses of hormones and caused severe side effects. Nonetheless, by 1962, 1.2 million American women were on the pill, and by 1965 the number had increased to 6.5 million.[64][65][66][67] The availability of a convenient form of temporary contraceptive led to dramatic changes in social mores including expanding the range of lifestyle options available to women, reducing the reliance of women on men for contraceptive practice, encouraging the delay of marriage, and increasing pre-marital co-habitation.[68]

Thalidomide and the Kefauver-Harris Amendments

Malformation of a baby born to a mother who had taken thalidomide while pregnant.
In the U.S., a push for revisions of the FD&C Act emerged from Congressional hearings led by Senator Estes Kefauver of Tennessee in 1959. The hearings covered a wide range of policy issues, including advertising abuses, questionable efficacy of drugs, and the need for greater regulation of the industry. While momentum for new legislation temporarily flagged under extended debate, a new tragedy emerged that underscored the need for more comprehensive regulation and provided the driving force for the passage of new laws.

On 12 September 1960, an American licensee, the William S. Merrell Company of Cincinnati, submitted a new drug application for Kevadon (thalidomide), a sedative that had been marketed in Europe since 1956. The FDA medical officer in charge of reviewing the compound, Frances Kelsey, believed that the data supporting the safety of thalidomide was incomplete. The firm continued to pressure Kelsey and the FDA to approve the application until November 1961, when the drug was pulled off the German market because of its association with grave congenital abnormalities. Several thousand newborns in Europe and elsewhere suffered the teratogenic effects of thalidomide. Without approval from the FDA, the firm had distributed Kevadon to over 1,000 physicians in the United States under the guise of investigational use. Over 20,000 Americans received thalidomide in this “study,” including 624 pregnant patients, and about 17 known newborns suffered the effects of the drug.

The thalidomide tragedy resurrected Kefauver’s bill to enhance drug regulation that had stalled in Congress, and the Kefauver-Harris Amendment became law on 10 October 1962. Manufacturers henceforth had to prove to FDA that their drugs were effective as well as safe before they could go on the US market. The FDA received authority to regulate advertising of prescription drugs and to establish good manufacturing practices. The law required that all drugs introduced between 1938 and 1962 had to be effective. An FDA – National Academy of Sciences collaborative study showed that nearly 40 percent of these products were not effective. A similarly comprehensive study of over-the-counter products began ten years later.[69]

1970–1980s
Statins
Main article: Discovery and development of statins
In 1971, Akira Endo, a Japanese biochemist working for the pharmaceutical company Sankyo, identified mevastatin (ML-236B), a molecule produced by the fungus Penicillium citrinum, as an inhibitor of HMG-CoA reductase, a critical enzyme the body uses to produce cholesterol. Animal trials showed a very good inhibitory effect, as did clinical trials; however, a long-term study in dogs found toxic effects at higher doses, including tumors, muscle deterioration, and sometimes death. As a result, mevastatin was judged too toxic for human use and was never marketed.

P. Roy Vagelos, chief scientist and later CEO of Merck & Co, was interested, and made several trips to Japan starting in 1975. By 1978, Merck had isolated lovastatin (mevinolin, MK803) from the fungus Aspergillus terreus, first marketed in 1987 as Mevacor.[70][71][72]

In April 1994, the results of a Merck-sponsored study, the Scandinavian Simvastatin Survival Study, were announced. Researchers tested simvastatin, later sold by Merck as Zocor, on 4,444 patients with high cholesterol and heart disease. After five years, the study concluded that the patients saw a 35% reduction in their cholesterol, and their chances of dying of a heart attack were reduced by 42%.[73] In 1995, Zocor and Mevacor both made Merck over US$1 billion. Endo was awarded the 2006 Japan Prize and the 2008 Lasker-DeBakey Clinical Medical Research Award for his “pioneering research into a new class of molecules” for “lowering cholesterol”.[74]
