What’s with this GFR test? For years there we were, medical professionals, just doing our jobs, when out of nowhere comes a ‘new’ test we suddenly have to become knowledgeable about. It’s like being the only child whose parents have come home with a brand new baby sibling we’re supposed to instantly love and accept. We didn’t ask for it, and things were fine until it came along to cause a lot of uncomfortable questions.
Hopefully this entry will answer questions about this newest diagnostic test.
First of all, GFR stands for Glomerular Filtration Rate. The glomerulus is the structure in the kidney responsible for filtering blood, so determining how well it is working provides a measure of kidney function.
Second of all, it’s not a new test. It’s actually a number calculated from a formula using four variables: plasma creatinine, age, gender and race.
Thirdly, the GFR derived from this formula is actually an estimated GFR. That’s why some labs report it as eGFR. Performing an actual GFR requires injecting a patient with a nuclear isotope (125I-iothalamate) and determining how long it takes for the kidneys to filter it out of the body. This is a procedure that cannot be performed on a large scale to screen patients who are at risk of developing Chronic Kidney Disease (CKD).
In 1989 the National Institute of Diabetes and Digestive and Kidney Diseases in the U.S. performed a study to see what effect diet could have on CKD, and two of the tests used were plasma creatinine and 125I-iothalamate GFR.
As a result, a lot of useful data was produced, allowing researchers to develop a formula to quickly estimate GFR. The result was a magic number of 60 mL/minute/1.73 m^2. As long as the eGFR was 61 or greater, the risk of CKD was decreased. An eGFR of 60 or less indicated the need for further investigation.
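As a sketch, the four-variable equation from that study can be written out directly. The coefficients below are the commonly published ones for the original study equation; treat them as illustrative, since analyzers with standardized creatinine use an adjusted constant.

```python
# Illustrative sketch of the four-variable eGFR study equation.
# Coefficients (186, -1.154, -0.203, 0.742, 1.212) are the commonly
# published values; IDMS-standardized creatinine uses 175 instead of 186.
def egfr_mdrd(creatinine_umol_l, age, female, african_american):
    scr_mg_dl = creatinine_umol_l / 88.4  # convert umol/L to mg/dL
    egfr = 186 * (scr_mg_dl ** -1.154) * (age ** -0.203)
    if female:
        egfr *= 0.742          # gender-specific adjustment
    if african_american:
        egfr *= 1.212          # race-specific adjustment
    return egfr                # mL/minute/1.73 m^2

# A result of 60 or less flags the patient for further investigation.
print(round(egfr_mdrd(88.4, 60, False, False)))  # ~81 for this example
```

Note how the plasma creatinine sits in the denominator (via its negative exponent): a higher creatinine means a lower eGFR.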
Actually then, all the eGFR does is take the plasma creatinine and present it as a number that indicates kidney function.
But here are some ground rules for using the eGFR.
It cannot be used for paediatric patients; those eighteen years and under cannot have an eGFR calculated. The test is gender specific. African Americans have, on average, a higher muscle content, requiring a different formula than for non-African Americans.
Plasma creatinine must be used. Values from point-of-care analyzers measuring creatinine using whole blood cannot be used in this formula.
Finally, any analyzer that measures plasma creatinine to calculate eGFR has to be standardized. What does that mean?
The purpose of the eGFR is to screen patients at risk of developing CKD. However, if the analyzer being used has a high degree of variability, the same patient could see a wide range of results. One day your patient could have kidney disease with an eGFR of 58; the next day they could be a healthy 62.
An example of a successful standardization program is the one launched by the province of B.C. in 2003. It provided labs with specimens of known creatinine values for them to analyze. These labs then sent the results to the province, which calculated a correction factor for the analyzer.
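A hypothetical sketch of how such a correction factor might be derived (all numbers here are invented for illustration; the actual provincial program’s data and statistics would differ):

```python
# Hypothetical example: reference specimens of known creatinine value
# (umol/L) versus what one analyzer actually reported.
known    = [50.0, 100.0, 200.0, 400.0]
measured = [58.2, 116.5, 233.0, 466.0]   # this analyzer reads high

# Average ratio of known to measured gives a simple multiplicative
# correction factor the lab can apply in its eGFR calculation.
factor = sum(k / m for k, m in zip(known, measured)) / len(known)
corrected = [m * factor for m in measured]
```

Because creatinine is in the denominator of the eGFR formula, an analyzer that reads creatinine high will report eGFR low, and vice versa.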
The results speak for themselves.
Before the standardization program, labs on average were reporting eGFR values 16.5% lower than they actually were. This meant that 535,000 British Columbians had been falsely flagged as being at risk of developing kidney disease. By having labs use the correction factor in their formula for calculating eGFR, 449,000 patients were prevented from being misdiagnosed. Needless anxiety for patients was avoided, along with the associated costs. The standardization program cost $335,000 to launch, and has a yearly budget of $135,000. Part of that goes towards educating physicians, patients and allied health professionals on the strengths and weaknesses of the eGFR as a diagnostic tool.
So the eGFR has become a useful diagnostic tool. The important thing is to be aware of its limitations, and not to obsess about its numbers.
Tuesday, July 27, 2010
Explaining the 'e' in eGFR
Labels:
Chronic,
Creatinine,
eGFR,
heart disease,
Institute,
Kidney,
National
Thursday, July 1, 2010
Genetic Strip Mining
My first lesson on neonatal screening made a dramatic impression on me. I was a student at BCIT in 1985, and it was during a clinical chemistry lecture that the PKU test was explained to me. Before this test had been developed, thousands of newborns each year were destined to be warehoused in state-run facilities for the mentally impaired, with no hope of any sort of normal life. The lecturer had worked in that era and had seen these patients firsthand.
So what happened to change this?
In 1934 Dr. Asbjorn Folling of Norway observed that certain mentally retarded patients smelt strange. Giving in to his curiosity, he discovered the source of the odour was phenylacetic acid present in the patients’ urine. From his observations, Folling concluded that the mental retardation resulted from genetics and diet. Since the source of the phenylacetic acid was a chemical called a phenylketone, which was also present in the urine, the disease was called phenylketonuria (PKU).
So the next step in treating the disease was to determine how this caused mental retardation in what appeared to be healthy newborns. The answer was that these infants lacked an enzyme called phenylalanine hydroxylase, which breaks down an essential amino acid called phenylalanine. Without this enzyme, phenylalanine builds up in the blood to elevated levels that are harmful to the central nervous system and, if left untreated, lead to brain damage. The phenylalanine that cannot be properly broken down is instead converted to phenylacetic acid, which is then excreted into the urine.
In 1951 Horst Bickel, a German professor, developed a protein drink without any phenylalanine in it, allowing an infant with PKU to receive proper nutrition. Then in 1958, Robert Guthrie developed a simple and inexpensive way of testing for PKU. All it involved was placing a few drops of blood on a piece of filter paper. Eight years later, in 1966, hospitals began screening for PKU.
In 1986, that was a very powerful message. Laboratory technologists could improve a patient’s outcome with a very simple test. Families were spared the bitter burden of institutionalizing a child.
Since this first neonatal screening, other tests have been included, such as genetic screening for cystic fibrosis and sickle cell anemia.
Unfortunately, events have now occurred that challenge not just PKU testing, but all neonatal screening.
In the four decades since the first PKU screening, genetic testing has grown exponentially. A person’s entire genome can be determined from a single drop of blood. Parents are concerned that blood stored on a PKU card can be used to determine that child’s genetic information and be kept in a private database. There is an expectation of privacy: that the blood samples from the millions of PKU cards collected are not turned into some huge ‘genetic strip mine’ as researchers and private companies take advantage of this huge pool of DNA.
Parents cannot let fear prevent them from having their infants screened for disease. At the same time, researchers have to be allowed to do research at a molecular level to help with disease diagnosis and treatment. The millions of PKU cards stored in hospitals worldwide make a tempting target.
It is for the good of the newborn that government makes newborn screening mandatory, and it is also good that researchers try to find treatments and cures for disease. From a historical perspective, did the doctor mentioned earlier, Dr. Folling, have to get permission to begin the groundbreaking experiments that led to the discovery of PKU? If he had been denied the opportunity to find out why those patients had strange-smelling urine, how many millions of patients would have had negative outcomes?
True, there has to be a limit on who has access to stored genetic material, whether it be dried blood on filter paper or tissue biopsies from a cancer patient. It is time for politicians to become not only involved, but informed.
What would my solution be? Earlier I used the term ‘genetic strip mining.’ Treat all stored genetic material as a public resource, and have any company that develops that resource pay a royalty to the government for it. Respect for patients’ privacy is paramount.
These are interesting times. But we can’t let fear prevent us from making sure children of every generation have a chance to overcome the challenges they face in this business we call life, including the DNA they are born with.
Labels:
cystic fibrosis,
genetic,
genetic strip mining,
neonatal,
PKU,
screening,
sickle cell anemia,
testing
Friday, April 16, 2010
A giant step for medicine
One day, future medical laboratory technologists will thank Barack Obama for setting in motion a manned space mission to Mars.
Why?
Because the last time man stepped on another celestial body (the moon), the payoffs for laboratory medicine were many. The push for the miniaturization of electrical components such as transistors allowed medical laboratory science to transfer that technology into developing ever more sophisticated analyzers. The technology also helped develop CAT and MRI scans.
Without doubt, there will be challenges. But the knowledge gained from overcoming these obstacles will have long-term positive outcomes for medicine. Who knows what new technologies and advancements will be developed (don't be surprised to see fantastic developments in nanotechnology).
Maybe they will even develop that holy grail of science fiction, the tricorder.
Labels:
Mars,
nanotechnology,
NASA,
Obama,
transistors,
tricorder
Sunday, January 10, 2010
Platelet Rich Plasma
In honour of the Olympics this blog will be dedicated to the topic of Platelet Rich Plasma (PRP). Lately PRP has been getting much media attention as a new miracle cure for athletic injuries. So what exactly is PRP, and how does it work?
If you were to take a tube of unclotted blood and spin it in a centrifuge, it would separate into two portions, a liquid portion called plasma, and a solid portion made up of cellular elements. The cellular portion would be divided into three categories. The largest portion would be the Red Blood Cells (RBC), responsible for carrying oxygen to the tissues. The next portion would be the White Blood Cells (WBC), the cells responsible for the body’s immune system. The final type of cells found would be the platelets, the cells responsible for blood clotting.
It is because of this function that platelets have been getting a lot of bad press lately. After all, circulatory disease is the major killer in the industrialized world, caused by blood flow to tissue being cut off by platelet clot formation. A lot of treatment, such as aspirin and Plavix, is based on reducing the clotting activity of platelets, so why would someone want to inject plasma that has an increased concentration of platelets?
Because despite the fact that platelet clumping can lead to death, platelets do more than stop blood flow when they are activated.
After all, without platelets we would all bleed to death. When tissue is damaged, it releases chemicals that activate platelets to clump and form a plug to prevent blood from leaking out. Once activated, platelets also release chemicals from granules inside them. These chemicals help the body start repairing the damaged tissue.
Two of the most important activators are Platelet Derived Growth Factor (PDGF) and Transforming Growth Factor Beta (TGF-β). PDGF attracts WBC, fibroblasts and smooth muscle cells to the injury site. Once there, these cells begin tissue repair by forming new cells, capillary networks that increase blood flow to the damaged tissue, and fibronectin, a cellular super glue that holds all of this together. TGF-β also attracts cells to the damaged tissue and causes them to increase production of collagen and fibronectin.
In 1986, researchers decided to see if platelets could be used to heal tissue. By centrifuging blood at a certain speed, WBCs and RBCs sink to the bottom, leaving platelets floating in the plasma at increased concentration. The plasma, now rich with platelets, is then removed. By adding chemicals that activate the platelets to begin clotting, a gel forms that can be applied to wounded tissue to speed up repair. This has been used successfully in different types of surgery to speed up recovery.
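As a back-of-the-envelope sketch of that concentration step (the volumes, counts and recovery fraction below are invented purely for illustration, not taken from any PRP protocol):

```python
# Hypothetical numbers: concentrating the platelets from a whole-blood
# draw into a much smaller plasma volume raises their concentration.
whole_blood_ml = 50.0
baseline_count = 250e3     # platelets per uL of whole blood
prp_volume_ml  = 6.0
recovery       = 0.70      # assumed fraction of platelets captured in PRP

total_platelets = baseline_count * whole_blood_ml * 1000  # 1000 uL per mL
prp_count = total_platelets * recovery / (prp_volume_ml * 1000)
fold_increase = prp_count / baseline_count  # how many times richer the PRP is
```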
More recently though, doctors in sports medicine have tried injecting PRP straight into the damaged area, usually a soft tissue injury such as Achilles tendonitis. Many famous athletes have received this type of treatment, reportedly with resounding success.
Recent studies, however, suggest this is due mostly to a placebo effect. Researchers injected patients with either saline or PRP and got the same results. Their conclusion was that using PRP for soft tissue sports injuries was useless.
However, using PRP as a gel of activated platelets has been found to help patients recover from surgery and unhealed ulcers.
Sunday, December 20, 2009
Adiponectin- Unlikely hero in the body
Adiponectin- Hormone responsible for putting out small fires.
The body is constantly under attack from the abuse of living. Internally, injuries occur at the molecular level, causing damage that spreads like wildfire until it is out of control. Metabolic Syndrome is an example of this. What is it? It is a term used to describe a group of conditions that put patients at higher risk of developing type 2 diabetes and/or heart disease. Patients having three or more of the following are considered to have Metabolic Syndrome:
- High fasting glucose (greater than 5.6 mmol/L)
- High Blood Pressure (130/85 or higher)
- High Triglyceride (> 1.7 mmol/L)
- Decreased HDL (<1.0 mmol/L in men, <1.3 mmol/L in women) (A trick to remember which cholesterol is which is that HDL is the Healthy one, LDL is the Lousy one)
- Abdominal obesity or too much fat around the waist (>102 cm (40 inches) for men, >88 cm (35 inches) for women)
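The three-or-more rule is easy to express directly. A minimal sketch, with the thresholds copied from the list above (sex-specific where noted):

```python
def metabolic_syndrome_criteria(glucose, systolic, diastolic,
                                triglyceride, hdl, waist_cm, male):
    """Count how many of the five criteria a patient meets.
    Glucose, triglyceride and HDL in mmol/L; waist in cm."""
    criteria = [
        glucose > 5.6,                        # high fasting glucose
        systolic >= 130 or diastolic >= 85,   # high blood pressure
        triglyceride > 1.7,                   # high triglyceride
        hdl < (1.0 if male else 1.3),         # decreased HDL
        waist_cm > (102 if male else 88),     # abdominal obesity
    ]
    return sum(criteria)

# Three or more criteria suggests Metabolic Syndrome.
n = metabolic_syndrome_criteria(6.1, 135, 80, 2.0, 0.9, 105, male=True)
print(n, n >= 3)  # 5 True
```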
There is some debate on what causes Metabolic Syndrome. Is it due to increased insulin resistance, genetics, old age or lifestyle? Wouldn’t it be nice if there was a diligent full-time firefighter in the human body that went around putting out the little brush fires these conditions cause before they became five-alarm threats to our health?
Researchers have discovered a hormone that does this and it is from an unlikely source.
After studying adipocytes, the cells found in adipose (fatty) tissue, researchers discovered that forty percent of the expressed genes were unknown or novel, including the gene that was most abundant in and specific to adipocytes. Further research identified a protein produced by adipocytes, termed adiponectin.
Logic would suggest that the more adipose tissue you have, the more adipocytes are present, resulting in increased levels of adiponectin, correct?
Wrong. To everyone’s surprise the higher the body mass index (BMI) the lower the adiponectin levels. Healthy patients with low levels of body fat had higher levels of adiponectin. Patients with Metabolic Syndrome and diabetes also had low levels of adiponectin.
So what does adiponectin exactly do?
It can help prevent atherosclerosis in blood vessels. Atherosclerosis is damage to vessel walls caused by the accumulation of fatty materials, such as LDL cholesterol, and white blood cells called monocytes. Adiponectin has been found to prevent this kind of damage by inhibiting the molecules that cause LDL and monocytes to stick to vessel walls. Once secreted by adipocytes, this protein enters the bloodstream looking for damaged cells lining vessel walls to repair, putting out small fires before they burn out of control.
The bad news is that measurement of adiponectin is not routinely done in the medical lab... yet. But when it is, it will help determine which patients are at risk of developing Metabolic Syndrome. A good article to read about this can be found at:
http://atvb.ahajournals.org/cgi/reprint/24/1/29
Thank you for taking the time to read my posting. I look forward to your thoughts and comments.
Regards,
Mark Hawkins
Labels:
Adiponectin,
atherosclerosis,
Cholesterol,
diabetes,
HDL,
heart disease,
LDL,
metabolic syndrome
Wednesday, December 2, 2009
Tight Glycemic Control
One of the complications of critical illness is hyperglycemia, high glucose levels, even in patients who are not diabetic. Controlling the glucose level presents an added challenge in the care of these patients, since it has been postulated that high glucose levels can lead to further complications such as septicemia, neuropathy and death. The conventional therapeutic approach has been to monitor the patient’s glucose, treat by insulin infusion when the plasma glucose is greater than 11.9 mmol/L (215 mg/dL), and then try to maintain it between 10.0 and 11.0 mmol/L (180-200 mg/dL). Although this does work, the question has been: would tighter glycemic control improve patient outcomes?
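Since glucose thresholds get quoted in both mmol/L and mg/dL throughout this literature, a quick conversion helper is handy. The standard factor is 18.0 mg/dL per mmol/L, which follows from glucose’s molar mass of roughly 180 g/mol:

```python
MGDL_PER_MMOLL = 18.0  # glucose molar mass ~180 g/mol, and 1 dL = 0.1 L

def mmoll_to_mgdl(mmol_l):
    return mmol_l * MGDL_PER_MMOLL

def mgdl_to_mmoll(mg_dl):
    return mg_dl / MGDL_PER_MMOLL

# The conventional treatment threshold of 11.9 mmol/L works out to
# ~214 mg/dL (commonly rounded to 215), and the 10.0-11.0 mmol/L
# maintenance range is 180-198 mg/dL.
print(round(mmoll_to_mgdl(11.9)))  # 214
```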
In 2001, the New England Journal of Medicine published a study by Dr. Greet Van den Berghe that tried to answer that question. In her study (which can be found at:
http://content.nejm.org/cgi/content/short/345/19/1359) 1548 ICU patients receiving mechanical ventilation were given intensive insulin therapy: an insulin infusion to maintain their glucose levels between 4.4 and 6.0 mmol/L (80-110 mg/dL). Glucose was measured using whole blood on an ABL700 at 1 to 4 hour intervals.
The results were impressive. Bloodstream infections were reduced by 46%, acute renal failure by 41%, RBC transfusions by 50% and critical illness polyneuropathy by 44%. Most impressive of all was the mortality rate being halved from 8.0% to 4.6%.
With such positive patient outcomes, many hospitals have started using tight glycemic control and have reported similarly impressive results. Logic would conclude that tight glycemic control works.
However, in March 2009 the NEJM released the results of the NICE-SUGAR (Normoglycemia in Intensive Care Evaluation-Survival Using Glucose Algorithm Regulation) study, which evaluated hospitals using tight glycemic control on critically ill patients.
It was a huge multinational study that examined the outcomes of 6000 patients. In this study (http://content.nejm.org/cgi/content/short/360/13/1283?ssource=mfv), the opposite was found: tight glycemic control did not decrease mortality, but increased it.
So where does that leave the critically ill patient? The day after the NICE-SUGAR study was released, the American Diabetes Association and the American Association of Clinical Endocrinologists issued a joint statement saying that tight glycemic control should not be abandoned, but that it should be up to the clinician whether a patient would benefit from it. This statement can be found at:
http://newswise.com/articles/joint-statement-on-the-nice-sugar-study-on-intensive-versus-conventional-glucose-control-in-critically-ill-patients?ret=/articles/list&category=latest&page=1&search[billing_institution_id]=0&search[date_range]=&search[institution_name
Dr. Van den Berghe’s groundbreaking study has shown that keeping glucose levels tightly controlled can improve patient outcomes, while the NICE-SUGAR study illustrates that before a hospital jumps on the tight glycemic control bandwagon, it has to have the infrastructure in place. The cornerstone of this is accurate and precise measurement of blood glucose levels. In order for that to happen, the lab will have to be involved. Why?
On the surface it looks like the lab has no role in tight glycemic control, since it is done by nurses. All that is required is to place a single drop of blood on a test strip, stick the strip in a small bedside monitor, and wait for the result to be displayed. It doesn’t get any simpler than that.
The same could be said about driving a car. There have been news stories of children as young as five taking their parents’ car for a spin down the freeway. That doesn’t mean the driving age should be dropped to six. The same applies to any health care professional performing bedside glucose monitoring. Before anyone can operate the monitors, they have to be trained not only in how to use them, but in the institution’s standard operating procedures as well. This is where the lab can be a vital resource, providing training and making sure the instruments are working properly.
If anyone involved in NICE-SUGAR is reading this posting, could you answer a simple question regarding the operators: was any external proficiency testing done?
What is external proficiency testing? It is standard in all labs, where an outside agency sends specimens that the lab has to analyze and send the result in. The labs performance is then evaluated depending on how accurate the results were. If the lab consistently produces good results, it passes. However if the lab is inconsistent in its performance, its accreditation can be taken away.
External proficiency testing is not cheap. But neither is quality.
Another reason the lab can be involved is if the doctor wants to check other analytes, such as electrolytes and ketone bodies. Patients who are dehydrated and are on certain medications can have spurious results on the bedside glucose monitors, and may require a different analyzer to measure their glucose level.
Some labs actually do the bedside glucose testing with very good consistent results. An example of this is the program run by Brenda Franks at Nebraska Methodist Hospital in Omaha where phlebotomists do the bedside glucose testing. An article on their success can be found at:
http://www.cap.org/apps/cap.portal?_nfpb=true&cntvwrPtlt_actionOverride=%2Fportlets%2FcontentViewer%2Fshow&_windowLabel=cntvwrPtlt&cntvwrPtlt%7BactionForm.contentReference%7D=cap_today%2F0909%2F0909f_POC_leader_spreads.html&_state=maximized&_pageLabel=cntvwr
Bottom line, tight glycemic controls works, but it’s not perfect. The lab should be used as an excellent resource if the organization wants to pursue this for their critically sick patients.
In 2001, the New England Journal of Medicine published a study by Dr. Greet Van den Berghe that examined whether tight glycemic control improves outcomes in critically ill patients. In her study (which can be found at:
http://content.nejm.org/cgi/content/short/345/19/1359), 1548 ICU patients receiving mechanical ventilation were given intensive insulin therapy: an insulin infusion to maintain their glucose levels between 4.4-6.0 mmol/L (80-110 mg/dL). Glucose was measured in whole blood on an ABL700 analyzer at 1- to 4-hour intervals.
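As a side note for readers who work in the other unit system, the mmol/L-to-mg/dL conversion and the tight-control target check above can be sketched in a few lines (an illustrative sketch only, not part of the study protocol; the study rounds 4.4 mmol/L to 80 mg/dL):

```python
# Glucose: 1 mmol/L is about 18.0 mg/dL (molar mass of glucose ~180 g/mol).
MMOL_TO_MGDL = 18.0

def mmol_to_mgdl(glucose_mmol):
    """Convert a glucose reading from mmol/L to mg/dL."""
    return glucose_mmol * MMOL_TO_MGDL

def in_tight_range(glucose_mmol, low=4.4, high=6.0):
    """True if the reading falls inside the tight-control target range."""
    return low <= glucose_mmol <= high

print(mmol_to_mgdl(4.4))    # ~79 mg/dL (quoted as 80 in the study)
print(in_tight_range(5.2))  # True
```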
The results were impressive. Bloodstream infections were reduced by 46%, acute renal failure by 41%, red cell transfusions by 50%, and critical illness polyneuropathy by 44%. Most impressive of all, mortality was nearly halved, from 8.0% to 4.6%.
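The mortality figures can be checked with one line of arithmetic: the relative risk reduction from 8.0% to 4.6% is (8.0 − 4.6)/8.0 ≈ 42.5%, which is why it reads as nearly halved. A minimal sketch:

```python
def relative_risk_reduction(control_rate, treated_rate):
    """Relative risk reduction, as a fraction of the control event rate."""
    return (control_rate - treated_rate) / control_rate

# Mortality in the Van den Berghe study: 8.0% conventional vs 4.6% intensive.
rrr = relative_risk_reduction(8.0, 4.6)
print(f"{rrr:.1%}")  # 42.5%
```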
With such positive patient outcomes, many hospitals adopted tight glycemic control and reported similarly impressive results. Logic would conclude that tight glycemic control works.
However, in March 2009 the NEJM published the results of the NICE-SUGAR (Normoglycemia in Intensive Care Evaluation-Survival Using Glucose Algorithm Regulation) study, which evaluated tight glycemic control in critically ill patients.
It was a huge multinational study that examined the outcomes of 6000 patients. In this study (http://content.nejm.org/cgi/content/short/360/13/1283?ssource=mfv), the opposite was found: tight glycemic control did not decrease mortality but increased it.
So where does that leave the critically ill patient? The day after the NICE-SUGAR study was released, the American Diabetes Association and the American Association of Clinical Endocrinologists issued a joint statement saying that tight glycemic control should not be abandoned, but that the decision on whether a patient would benefit should be left to the clinician. The statement can be found at:
http://newswise.com/articles/joint-statement-on-the-nice-sugar-study-on-intensive-versus-conventional-glucose-control-in-critically-ill-patients?ret=/articles/list&category=latest&page=1&search[billing_institution_id]=0&search[date_range]=&search[institution_name
Dr. Van den Berghe’s groundbreaking study showed that tightly controlling glucose levels can improve patient outcomes, while the NICE-SUGAR study illustrates that before a hospital jumps on the tight glycemic control bandwagon, it has to have the infrastructure in place. The cornerstone of this is accurate and precise measurement of blood glucose levels. For that to happen, the lab has to be involved. Why?
On the surface, it looks like the lab has no role in tight glycemic control, since the testing is done by nurses. All that is required is a single drop of blood placed on a test strip, which is inserted into a small bedside monitor, and the result is displayed. It doesn’t get any simpler than that.
The same could be said about driving a car. There have been news stories of children as young as five taking their parents’ car for a spin down the freeway. That doesn’t mean the driving age should be lowered to five. The same applies to any health care professional performing bedside glucose monitoring: before anyone can operate the monitors, they have to be trained not only on how to use them, but also on the institution’s standard operating procedures. This is where the lab can be a vital resource, providing training and making sure the instruments are working properly.
If anyone involved in NICE-SUGAR is reading this posting, could you answer one simple question regarding the operators: was any external proficiency testing done?
What is external proficiency testing? It is standard in all labs: an outside agency sends specimens that the lab has to analyze and report back on. The lab’s performance is then evaluated on how accurate the results were. If the lab consistently produces good results, it passes. If the lab is inconsistent in its performance, its accreditation can be taken away.
External proficiency testing is not cheap. But neither is quality.
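The scoring idea behind a proficiency survey can be sketched as follows. This is an illustration only; acceptance criteria differ between proficiency programs, and the ±10% limit and the pass fraction below are assumptions, not any agency’s actual rules:

```python
def within_limit(reported, target, limit_pct=10.0):
    """True if a reported result is within limit_pct percent of the target."""
    return abs(reported - target) <= target * limit_pct / 100.0

def evaluate_survey(results, limit_pct=10.0):
    """results: list of (reported, target) pairs from one survey shipment.
    Returns the fraction of results judged acceptable."""
    acceptable = sum(within_limit(r, t, limit_pct) for r, t in results)
    return acceptable / len(results)

# Hypothetical glucose survey: (lab result, assigned target) in mmol/L.
survey = [(5.1, 5.0), (9.8, 10.0), (3.1, 2.5)]
print(f"{evaluate_survey(survey):.0%}")  # 67% - the third result misses
```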
Another reason for lab involvement is when the doctor wants to check other analytes, such as electrolytes and ketone bodies. Patients who are dehydrated or on certain medications can get spurious results on bedside glucose monitors, and may require a different analyzer to measure their glucose level.
Some labs actually do the bedside glucose testing themselves, with very good, consistent results. An example is the program run by Brenda Franks at Nebraska Methodist Hospital in Omaha, where phlebotomists do the bedside glucose testing. An article on their success can be found at:
http://www.cap.org/apps/cap.portal?_nfpb=true&cntvwrPtlt_actionOverride=%2Fportlets%2FcontentViewer%2Fshow&_windowLabel=cntvwrPtlt&cntvwrPtlt%7BactionForm.contentReference%7D=cap_today%2F0909%2F0909f_POC_leader_spreads.html&_state=maximized&_pageLabel=cntvwr
Bottom line: tight glycemic control works, but it’s not perfect. The lab is an excellent resource if an organization wants to pursue it for their critically ill patients.
Monday, November 2, 2009
The future of the INR
Traditional anticoagulants such as warfarin and heparin prevent clot formation by interfering with the function of blood proteins called clotting factors. Dabigatran belongs to a new class of anticoagulants called direct thrombin inhibitors, which work by interfering with the final stage of clot formation, preventing the enzyme thrombin from converting fibrinogen to fibrin. So how will this affect the medical laboratory?
The warfarin industry is a billion-dollar industry, and part of the cost is constantly monitoring its therapeutic effect on the patient. The test used to monitor warfarin therapy is called the INR (short for International Normalized Ratio). A low INR means the dose is too low and there is an increased risk of clots forming. A high INR means there is a risk of uncontrolled bleeding. Both situations require time and resources to bring the INR back into the therapeutic range.
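For context, the INR is derived from the prothrombin time (PT): INR = (patient PT / mean normal PT)^ISI, where ISI is the reagent’s International Sensitivity Index. A minimal sketch of the calculation, using 2.0-3.0 as the typical warfarin therapeutic range (actual targets vary by indication, and the numbers in the example are made up):

```python
def inr(patient_pt, mean_normal_pt, isi):
    """INR = (patient PT / mean normal PT) ** ISI."""
    return (patient_pt / mean_normal_pt) ** isi

def classify(inr_value, low=2.0, high=3.0):
    """Classify an INR against a typical warfarin therapeutic range."""
    if inr_value < low:
        return "subtherapeutic: increased clot risk"
    if inr_value > high:
        return "supratherapeutic: increased bleeding risk"
    return "therapeutic"

# Hypothetical patient: PT 28.0 s, mean normal PT 12.0 s, reagent ISI 1.0.
value = inr(patient_pt=28.0, mean_normal_pt=12.0, isi=1.0)
print(round(value, 2), classify(value))  # 2.33 therapeutic
```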
Since Dabigatran does not require any blood tests to monitor it, the INR could become obsolete, and with it the entire lab infrastructure that goes into doing this test.
This is not the first time an established anticoagulant therapy has been replaced by something better. For years a drug called heparin was used to treat blood clots. Like warfarin, it needed to be constantly monitored, by a lab test called the APTT, and had to be administered intravenously in a hospital setting. Then, ten years ago, Lovenox (also known as low molecular weight heparin) was released, with the benefits of being injectable and not requiring constant monitoring.
But if dabigatran replaces warfarin for the prevention of blood clots, the patient no longer needs any monitoring, and the lab loses a client. Can you blame the patient, though? No more going to the lab to be poked and prodded; over time, veins become scarred, making it harder and more painful to collect blood specimens for the INR test. Dabigatran also has none of the dietary restrictions that warfarin has.
It stands as a testament to warfarin’s contribution as a cost-effective way of improving people’s lives that it has taken over 50 years for a suitable replacement to be found.
How can labs face the challenges dabigatran presents?
Even if dabigatran replaces warfarin, it will not make warfarin or the INR test obsolete. Like all drugs, dabigatran cannot be taken by everyone. Heartburn was the major non-bleeding side effect, severe enough in some patients that the drug had to be discontinued.
The long-term effects of taking dabigatran have not been well studied, either. How will the human body be affected by taking this drug for six months? A year? A decade? Possibly dabigatran will have minor side effects that require some lab testing, the way Lipitor now requires liver function testing.
Another finding was a very small but statistically significant increase in the risk of heart attack in some patients. Why is still being investigated, but it could require a lab test to screen patients in this risk group, possibly a future gene test designed for this purpose.
Even if a patient has none of the above problems, any anticoagulant therapy carries a chance of uncontrolled bleeding, and it will be up to the lab to determine whether the bleeding is due to dabigatran. This is done using one of two tests: thrombin time (TT) or ecarin clotting time (ECT). Medical labs will need to offer these two tests to physicians if dabigatran use increases.
It is quite possible that warfarin will become obsolete, and this will have a huge impact on medical laboratories.
But it is only by understanding what these changes are and preparing for them that labs will be able to survive and even thrive.
Thank you for taking the time to read my posting. I look forward to your thoughts and comments.
Regards,
Mark Hawkins