E-mail: middletonjohn474@gmail.com
• Reference: Michael S. Poirier, President & CEO
• Reference: Kendal Jensen, M.D., Ph.D.
• Reference: Karen MacLeod, Executive Vice President
Thanks for taking the time to check out my Bio. In addition to offering detail and context that a resume cannot provide, I have written up some additional material which I hope you find interesting.
I have spent a 20-plus year career studying, developing, and producing analytical systems. Of course, this covers a gamut of opportunities. During this time, I was involved in many aspects of pure research, assay development, system development, transfer to production, operations, quality, and line support. Of course, to accomplish anything, it is rarely I and almost always we. From justifying changes to a production process to writing for peer-reviewed scientific journals and everything in between, a significant amount of writing was involved. Oftentimes this also involved presenting work and leading projects composed of cross-functional teams; effective communication is needed to foster change with agility and effectiveness. I believe that for any professional, in any day and age, solid communication is paramount.
As I highlight some of what I perceive to be the more interesting aspects of my career, I have taken great care to honor the intellectual property concerns of those involved. Also included in this restraint are various business considerations. As such, I describe only experience that cannot have any adverse effect on the wonderful organizations that I have had the privilege to be a part of.
Also, it is important to note that in order to properly tell my story I have to mention some of the people I’ve met along the way. Yes, to some extent this is name dropping; however, I would not be the professional I am today without the people I reference.
Lastly, although the majority of the content of this Bio involves technical notes, I must acknowledge that much of the work in the regulated environments I have operated in requires a great deal of what I call Quality System Management activities. In other words, behind the implementation of all the changes we accomplished were numerous reports (technical and otherwise), procedure documents, design reviews, schedules, requirement documents, test plans, risk management analyses, etc., that were performed and filed on behalf of the projects.
Since university researchers are typically budget constrained and I was a work-study student (my wage was partially paid by the university), I immediately found many opportunities to work in the sciences at UCR. My very first exposure began when I started working for Professor Leland Shannon in the biochemistry department. My first job was to bleed rabbits that had been immunized with a class of plant proteins called lectins. I injected them, took the blood samples, and titered the serum (Ouchterlony). Since I became proficient at this, I was lent out to other researchers to perform these duties. I also washed glassware and raised KDR Drosophila (a pesticide-resistant strain) for entomologists doing pesticide research. It turns out those dishes had to be exceptionally clean since they were doing research on a class of chemicals known as pyrethroids; multiple acetone rinses were involved. For a different project, I ground up liquid-nitrogen-frozen boll weevils and performed quantitative lipid extraction as a function of boll weevil group. While helping Professor Dan Arp harvest root nodules for his research on nitrogen fixation, I made a good friend in his post-doc, Dr. Michael Hyman. Through discussions with Mike, I got a glimpse into the mind of a scientist. Previously I had simply taken my marching orders from the various professors’ lab leaders and performed the tasks assigned. Although the work in the biochemistry, plant pathology, and entomology departments was interesting and I certainly met a number of amazing people, I knew based on this exposure that analytical chemistry better suited me. Also, fortunately for me, during this stint I spent some time outside of the lab with some amazing people including Walter Turkowski, Paul Marella, and Lee Spence; perhaps by some human osmosis, just a bit of them rubbed off on me.
My first paying job outside of college was as a contract technician (Ad-Tek Engineering) for the systems test lab at Beckman Instruments in Brea, California. I landed this position due to a referral from my good friend Dr. Walter Turkowski; he was leaving the position to go to medical school. This is where I got my start in the IVD industry.
My first responsibilities were to run line support experiments for the Astra instrument. Here a pre-defined set of samples was tested against a pre-defined set of assays. I was not designing experiments at this juncture. Since there was considerable down time between experiments, I took advantage of the opportunity to familiarize myself with this strange and wonderful thing known as a personal computer. I started learning Lotus 1-2-3 using the communal IBM machines (XT and AT models) and got some of my first tastes of data analysis. Based on my performance in this position, I earned a promotion and was hired as a permanent employee at the Senior Technician level. I later got involved in Synchron CX3 and Lablyte development. At the 1988 AACC conference poster session my name first entered the light when it appeared in the author list of an abstract on Lablyte analytical performance. Of course, I was included because I made a number of the measurements that were presented.
Following that I became involved in CX5 and CX7 development. Learning the details of how these advanced systems functioned was intellectually stimulating and quite fun, to say the least. Also, at that time I began to work on more advanced data analysis and visualization tools using RS/1. Boxplots and multidimensional scatter plots were among the first that were used. I worked for Joe Kaufman at the time I decided to leave that position. He had given me a perfect performance review in hopes of promoting me, but the promotion was rejected at levels above him. He was a good scientist and a good boss. I also befriended co-worker Jerzy Tomasik during this stint at Beckman. To this day he is one of the smartest people I have ever met.
My scientific career advanced when I worked for Professor Eric Chronister at UCR. He is a spectroscopist interested in exploring ultra-fast transition phenomena. Although much of his research was over my head (and still is), I am grateful for the opportunity he gave me to work in his lab. As an undergraduate member of the group I built a simple spectrophotometer used to make time-resolved fluorescence anisotropy measurements, made measurements, and presented data at group meetings and poster sessions. One such event was a Western Spectroscopy Association meeting, and the work was later published in SPIE proceedings. I went to this meeting as the only undergraduate attendee; what a blast! The idea that fundamental energy transitions can be observed in real time fascinated me. A career in science was for me. Dr. Robert Crowell, a high-level spectroscopist (team leader at multiple national labs), and I are friends to this day.
Unlike previous sections offered in this Bio, I have grouped activities based on subject matter rather than chronologically.
Over the course of my Beckman Coulter career I was involved with system integration, optimization, verification, and validation activities for the Astra, Lablyte, CX3, CX5, CX7, Immage, LX20, and DxC assay platforms. This process involved iterative improvements in hardware, software, reagent formulation, assay data handling, and value assignment. To assess performance, the data generated was not only reconciled with design requirements but also compared to reference data sets. As an example, one important feature we implemented was cap piercing for the DxC instrument. Rather than requiring the user to remove the blood collection tube cap, the instrument used a specialized knife to slice the cap in a manner that allowed the sample probe to penetrate without issue. For this, copious amounts of data were required to verify the reliability of the implementation. For many of the other project verifications and validations, analyses of precision, linearity, methods comparison, etc. were also included with regulatory submissions.
As a member of material review boards, I participated in problem solving for reagent and instrument production lines. Here we designed and performed experiments to justify deviations and support production process changes. Changes included specification and testing refinements as well as hardware and materials updates. As highlighted below, because of the natural lulls in line support activities, I was able to pursue independent research projects, write journal articles, and develop patentable technologies in support of organizational goals.
When I was first hired, or re-hired (yes, I got that promotion from Senior Technician to Associate Scientist), I worked for one of my mentors, Dr. Art Kessner (brilliant). I continued learning the science of making analytical measurements. Specifically, it is under his tutelage that I began to learn the science of value assignment. One of my first projects was to complete the task of writing up a large control value assignment study that was performed as part of the Synchron CX7 product release. It was here that I continued my exploration of the meaning of variance, specifically the notion that total variance can be characterized by considering its components. For the most part, the ranges published in control inserts reflect the total variation of the assay system and the error in the estimate of the target value. What is also interesting is that ranges must often be empirically adjusted to account not only for limitations in the data sets used to perform the statistics, but also for special sources of variation such as analyte degradation, changes in precision profiles over time, and even marketing concerns. In short, it is trickier than it seems to develop a good control range.
Later on, I developed a computationally simple method to assign control target values by considering the values from other methods. Here the weights of the other methods can be used to estimate targets. This can be particularly useful when a control is assigned for a specific analyte across multiple measurement platforms. The utility is amplified when it is used to assign values for platforms on which measurements are difficult to obtain. Perhaps that is why Beckman decided to go all the way to patent for this invention. The reference is United States Patent 7,312,083, Control value assignment method: a method of assigning expected recovery values to control substances used in analytical testing is disclosed.
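For illustration only (the patent text describes the actual procedure), here is a minimal Python sketch of the general idea, assuming hypothetical per-method means and simple inverse-variance weighting:

```python
import numpy as np

# Hypothetical per-method summaries for one control/analyte combination:
# mean recovery and standard error from each measurement platform (assumed values).
method_means = np.array([4.98, 5.05, 5.11])
method_ses   = np.array([0.04, 0.06, 0.10])

# Weight each method by the inverse of its variance, then form a
# weighted mean as the candidate control target value.
weights = 1.0 / method_ses**2
target = np.sum(weights * method_means) / np.sum(weights)

# Approximate standard error of the weighted target.
target_se = np.sqrt(1.0 / np.sum(weights))

print(f"assigned target = {target:.3f} +/- {target_se:.3f}")
```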
Although control value assignment is interesting, important, and nuanced, it is not as fun as calibrator value assignment. Calibrator value assignment integrates directly with many aspects of system design: assay signal processing, calibration, and links to external accuracy bases, to name a few. Dr. Jeff Vaks, another mentor of mine, and I wrote a Clinical Chemistry journal article describing one particular value assignment project. The paper is titled, Evaluation of assigned-value uncertainty for complex calibrator value assignment processes: a prealbumin example.
There are a number of interesting aspects of this analysis. Not only was an alternative method (relative to a method described by Schlain B.) described for transferring values from a single-level standard to a set of internal calibrators, but the assigned-value uncertainty was also characterized by using Monte Carlo simulation and matrix math to determine the contribution of covariance to the process. Over the years working together I learned a great deal from Dr. Vaks.
This activity was part of a larger IVDD compliance project to characterize the calibrator uncertainty for all of the general chemistry and immunoassay products. Under the direction of my supervisor at the time, Dr. Stephen Alter (great boss), I successfully completed this task. As part of this project I was sent to Belgium to work with academics and other manufacturers on compliance strategy. This was the first time, and hopefully will not be the last, that I flew first class.
Years before, I had begun exploring the use of Monte Carlo simulations to probe aspects of analytical systems. One of the analyses I performed was interesting enough to publish in Clinical Chemistry. Here I performed simulations to get an idea of how assay variation affected cardiac clinical risk assessments. Using simple techniques, I uncovered some interesting features of the referenced risk model. For example, the influence of HDLC assay imprecision is more important than cholesterol and CRP imprecision in terms of risk assessment error. The paper is titled, Effect of analytical error on the assessment of cardiac risk by the high-sensitivity C-reactive protein and lipid screening model.
An important aspect of characterizing assay platforms is the evaluation of variation. This process can not only quantify performance, but also discover opportunities for refinement. Since analysis aimed at knowing the "true" variation associated with a system is often confounded by special sources of variation (outliers, etc.), I developed a method to robustly determine variation. Here data are expressed in terms of cumulative probability. The slope of a line through a cumulative probability plot is an estimate of the standard deviation. This process, based on S-PLUS functionality, performed better than the state of the art at the time. The details are described in U.S. Patent Application 20060241904, Determination of standard deviation. I believe that, although the claims had been accepted, the company decided not to pursue patent status since it was not in the business of licensing this type of technology to software companies. I'm proud of this invention, as teams of professional statisticians had worked on this for years and yours truly, a systems engineer and data-driven scientist, discovered and developed a method which showed noticeable improvements in performance.
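As a minimal sketch of the general concept (not the patented implementation), the example below uses assumed synthetic data and fits a line through the central portion of a cumulative probability (normal probability) plot; its slope estimates the standard deviation while remaining relatively insensitive to outliers:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
data = rng.normal(loc=10.0, scale=0.5, size=200)
data[:3] = [14.0, 15.0, 6.0]          # a few deliberate outliers

x = np.sort(data)
n = x.size
# Cumulative probabilities (plotting positions) and corresponding normal quantiles.
p = (np.arange(1, n + 1) - 0.5) / n
z = stats.norm.ppf(p)

# Fit a line through the central portion of the plot (25th-75th percentile)
# so outliers in the tails have little influence; the slope estimates the SD.
core = (p > 0.25) & (p < 0.75)
slope, intercept = np.polyfit(z[core], x[core], 1)

print(f"classical SD = {np.std(data, ddof=1):.3f}, robust slope estimate = {slope:.3f}")
```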
In the area of assay signal processing, one of my notable achievements was optimizing the spectroscopic parameters for the Valproic Acid assay. For the Synchron LX20 system, the final analytical reaction signal is calculated by taking data from 10 wavelengths and converting it into a single signal-versus-time stream by multiplying several matrices. The matrices contain spectroscopic parameters for flash correction, interference correction, reagent outgassing, and extinction coefficients. The general process was published in an AACC poster session in 1992 (Pierre K, Metzler M, Dewberry T, Shui R, Vogt D, Levya P, Polychromatic correction for serum bilirubin, hemoglobin, and lipemia interferents on the Beckman Synchron CX7 system, Abstract. Clin Chem 1992; 38:1024). Additional math is then used to determine the final signal for assay calibration. I discovered that using numerical optimization techniques to adjust the spectroscopic parameters can significantly improve performance. Although this was performed specifically for VPA precision improvements, the notion is transcendent, i.e., the technique can be generally applied to optimize many aspects of assay performance using raw signal data, e.g., reducing cross-reactivity and the effects of various interferents. Another cool thing about this type of work is that significant improvements in performance can be realized with minimal cost. In other words, adjusting software parameters is much cheaper than making hardware modifications, i.e., cutting metal. I’m proud of this work in part because teams of advanced-level engineers, mathematicians, statisticians, and chemists did not discover this opportunity; yours truly did.
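To make the idea concrete, here is a heavily simplified sketch with synthetic data and an assumed signal model (the real LX20 matrices and math are not reproduced here) showing how numerical optimization can tune channel coefficients so that replicate imprecision is minimized:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)

# Synthetic stand-in for raw data: replicate reactions, each read at
# 10 wavelengths over 20 time points (replicates x wavelengths x time).
n_reps, n_wl, n_t = 12, 10, 20
true_kinetics = np.linspace(0.0, 1.0, n_t)
raw = (np.outer(np.linspace(1.0, 0.1, n_wl), true_kinetics)[None, :, :]
       + 0.02 * rng.normal(size=(n_reps, n_wl, n_t)))

def assay_signal(coeffs, raw):
    # Combine wavelength channels into a single signal-vs-time stream,
    # then take the end-point minus start-point difference as the final signal.
    combined = np.einsum("w,rwt->rt", coeffs, raw)
    return combined[:, -1] - combined[:, 0]

def replicate_cv(coeffs):
    # Objective: the replicate-to-replicate imprecision (CV) of the final signal.
    s = assay_signal(coeffs, raw)
    return np.std(s, ddof=1) / abs(np.mean(s))

start = np.ones(n_wl) / n_wl                       # nominal coefficients
result = minimize(replicate_cv, start, method="Nelder-Mead")

print(f"CV before: {replicate_cv(start):.4f}, CV after: {replicate_cv(result.x):.4f}")
```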
As we continue, I would be remiss if I did not include descriptions of some of my experiences (mid-2000s) in graduate school at California State University, Fullerton. One of the great things about working for Beckman Coulter was that the organization encouraged continuous improvement and education. Because of this, and the fact that they paid for tuition and books, I was empowered to pursue graduate studies. My research advisor was Professor Scott Hewitt, a brilliant, passionate, committed educator and researcher. He was primarily interested in probing the kinetics of environmentally important chemical systems. My fellow students included Richard Frechen, David Robichaud, and Guy Dadson. More often than not I finished 2nd to these folks in the classroom. It was a delight to study with them. Although I started down the atmospheric chemistry research path, I eventually came up with a research project of my own. The Kinetics of Immunoprecipitation Reactions project was approved. Here absorbance measurements made at 10 wavelengths over time for samples with varying analyte concentrations were characterized using scattering theory and kinetics models. Unfortunately, as time progressed, I lost the eye of the tiger for this project. I was thriving at a full-time job and was also spending time on romantic concerns and some important life hobbies. Although I did not finish my Master's thesis, my experiences at CSUF were fulfilling and rewarding, and ultimately made me a better scientist. I finished my coursework (did well, with a majority of A's) and was involved in some interesting science. Who knows, maybe one day I'll dust off that research project, finish it, and write it up.
Also, near this time in my career, I was given the opportunity to teach mathematics. The Dean of Continuing Education at Los Angeles Trade Technical College, Richard Browne, offered me a part-time evening position teaching an SAT preparation skills course. Later I also taught a GED math preparation course. Aside from the extra money, I enjoyed this a great deal. It was interesting, challenging, and eye opening. I did not realize how hard teaching was. Although the course material was straightforward, offering the material to students in a dynamic, engaging, real-time manner is not trivial. Based on audits of my work, evidently I was pretty good at it. In the end, I gained a renewed respect for the teaching profession.
Leadership phase 1: Later during my tenure at Beckman Coulter, grooming for management started. Rather than being a purely technical contributor with minimal group leadership responsibility, I was tasked with leading cross-functional teams. This was fun, interesting, and certainly useful for the development of some soft skills.
I was the Project Manager (relatively small project) for the team that first applied a Cystatin C assay (Dako reagent) to the Immage system. This was done in collaboration with the scientists and business people from Dako. The project was completed on time.
I was the development team leader for the business-critical high sensitivity CRP re-standardization project. Here I led the development, contributed technically, and managed the transfer to production of the designed changes. We successfully removed some systematic bias from the Synchron LX20 assay that was significantly affecting field implementation of the product.
Over this time, as a result of continuous learning and leadership, I was promoted 3 levels, from Associate Scientist to Staff Scientist. Of course, this could never have happened if it weren’t for the mentorship I received along the way. An impressive, comprehensive list of Beckman products can be found with this link; it was a privilege to be part of a few of them. My supervisor at the time I left Beckman Coulter was Sam Shammas. He was a wonderful supervisor, team leader, and manager.
In autumn of 2007, I was contacted by a former coworker, Dr. Wajdi Abdul-Ahad, about a new opening at his company that he felt I was a good fit for: Director, System Development. We had worked together in the past on a number of projects. To this day, I do not know anyone who knows more about immunoassay development than Wajdi. I interviewed and landed the job. I worked for Michael Poirier, CEO, during my entire stint at Qualigen. To say that Michael is not a micromanager is an understatement; he empowered employees to perform. During my 10-plus years at Qualigen I can count on two hands how many times I was issued a direct order to perform a task. Given my independent nature and improving leadership qualities, this was a good fit. When I left Qualigen my title was VP, System Development. Although I attended executive staff meetings later in my Qualigen career, my VP title is a bit deceiving as I never functioned in a corporate capacity. For example, I never worked on budgeting or business development concerns. My focus was on achieving goals that were technical in nature; I led teams for this purpose and directly contributed to their successes.
Although at the highest level Beckman Coulter and Qualigen were the same kind of company, as both operated in the In Vitro Diagnostics space, I immediately noted a large difference. In part because of the organizations’ relative sizes, Qualigen’s organizational agility was an order of magnitude greater than that of Beckman Coulter. Also, although I had some superficial exposure at BCI, I had never worked closely with heterogeneous immunoassay systems before. Learning this new technology made the job just that much more interesting from the onset. I hit the ground running upon arrival. As noted below, we were able to accomplish many tasks and complete projects quickly with minimal resources.
Leadership phase 2: Sometimes it’s preferred to listen and let the work happen. In other circumstances, pushing the agenda is needed. In some instances, a hallway meeting is the best way to proceed; in other instances, getting all of the troops together is required. Of course, there is an art to picking one's battles. To say that I have mastered all aspects of project leadership would be a falsehood; I am still learning. With that being said, because we were committed, we solved many problems and made many improvements. Again, the material presented is not strictly chronological. For all of the items listed, I was not only a key technical contributor but also led implementation.
Current Business: When I arrived at Qualigen it was understood that I was an expert in value assignment and assay standardization. Consequently, over the years I led the standardization and re-standardization of most of the Qualigen assays. FT4 was particularly interesting in this regard because of an FT4 solution formulation problem. In order to generate signal-versus-concentration curves in production, commutable reference materials are required. I discovered that formulating FT4 standards and QC materials is particularly difficult because of the binding dynamics of this analyte with albumin. In order to perform this efficiently, I had to develop a tool that characterized the stock solution used to spike the standards using a non-linear dilution-versus-concentration profile. Because of the efficacy of this approach, it was generally adopted for the formulation of all calibrators, controls, and standards built at Qualigen.
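As a hedged illustration of why the dilution-versus-concentration relationship is non-linear, the sketch below uses a simple single-site T4/albumin binding equilibrium with assumed constants (not Qualigen's actual tool or numbers):

```python
import numpy as np

# Hypothetical constants for a single-site binding model (illustration only).
Kd = 2.0e-9        # assumed dissociation constant, mol/L
albumin = 6.0e-7   # assumed binding-site concentration, mol/L

def free_t4(total_t4):
    """Free T4 for a given total T4 from the 1:1 binding equilibrium.

    Free F satisfies F^2 + F*(Kd + P - T) - Kd*T = 0 for protein sites P
    and total analyte T, so the free (measurable) level is a non-linear
    function of how much analyte is spiked in.
    """
    b = Kd + albumin - total_t4
    return (-b + np.sqrt(b * b + 4.0 * Kd * total_t4)) / 2.0

# Spiking the same matrix with linearly increasing totals does not produce
# linearly increasing free concentrations; spike amounts must instead be
# back-calculated from the desired free levels.
for t in np.linspace(0.0, 2.0e-7, 6):
    print(f"total = {t:.2e} M  ->  free = {free_t4(t):.2e} M")
```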
Another interesting standardization project was conducted for the testosterone assay. Because the state of the art was evolving, it became important for Qualigen to agree with liquid chromatography tandem mass spectrometry (LC-MS/MS) methods. To achieve this, I conducted a patient correlation study in collaboration with Robert Fitzgerald at the VA hospital in San Diego. Using the data generated, reagent curve fitting standards were designed, built, and value assigned such that the assay recovered in agreement with the new LC-MS/MS method. However, during this process a problem emerged involving female specimen agreement. Most immunoassays, due to non-specific interferences, exhibit a positive bias for these samples. Because of this challenge, traditional methods for assigning the internal standards at the low end of the assay range did not work. To solve this problem, I developed a process by which the low-end standards' signal data were adjusted during curve fitting. As a result of this effort, and unlike many immunoassays, the agreement between Qualigen testosterone and the reference method for all patients tested, including females, was excellent.
Also, early on I learned of an important problem that had befuddled the current technical experts. Because of the importance of magnet face dimensions for the Fastpack product, a laser height gauge was purchased to assess magnet heights relative to a mounting face. Because the data from this sensor was quite noisy, folks had difficulty implementing this equipment. I developed a process for analyzing the gauge data using median filters along with derivative profiles, and wrote VBA-based software to automate it. The day was saved; instrument stop-shipments were no longer an issue. Because of this accomplishment, I was offered an unscheduled pay raise of 10%. Rather than taking the raise, I opted for an additional week of vacation.
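A minimal sketch of the general approach, using synthetic scan data and assumed dimensions rather than the actual VBA implementation:

```python
import numpy as np
from scipy.signal import medfilt

rng = np.random.default_rng(2)

# Synthetic stand-in for a noisy laser height scan across a magnet face:
# a flat mounting face, a step up onto the magnet, then back down.
x = np.linspace(0.0, 10.0, 1000)                  # scan position, mm (assumed)
profile = np.where((x > 3.0) & (x < 7.0), 1.25, 0.0)
scan = profile + 0.05 * rng.normal(size=x.size)

# Median-filter the scan to suppress spikes without smearing the step edges.
smooth = medfilt(scan, kernel_size=31)

# The derivative profile locates the edges; the plateau between them
# gives the magnet height relative to the mounting face.
d = np.gradient(smooth, x)
rise, fall = np.argmax(d), np.argmin(d)
height = np.median(smooth[rise:fall]) - np.median(np.r_[smooth[:rise], smooth[fall:]])

print(f"estimated magnet height: {height:.3f} mm")
```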
The primary reason I was originally hired was to perform the system verification for the first version of the next generation product. I came in at the late prototype stage. The long and short of it is that the instrument did not function properly. In addition, there was no whole blood separation module. Consequently, the project was shelved. Had it not been for delivering early on the magnet face problem, and the fact that management sensed that I was a high-level problem solver, I might have been "downsized".
Prior to, and in conjunction with next generation development, I led continuous process improvement efforts for the company. Here we triaged the opportunities facing day to day business. The reagent consumables production line, the instrument production line, general operational efficiency, quality, customer feedback, cost reduction, part obsolescence, evolving state of the art concerns etc. were looked at frequently for the purpose of improving business robustness.
Early in my Qualigen career I noticed a huge opportunity within the production and development organizations. In order to perform the required day-to-day data analysis, results were manually transcribed from instrument screens into PC tools. In the interest of efficiency, I decided to pursue automating this process. Since the instruments sent data serially, I set up data collection modules to connect multiple instruments to PCs through a serial-to-USB converter (Edgeport). Now that we had the ability to collect text files from multiple instruments onto computers, I developed software to parse the text files into usable Excel files. The work hours saved over the years as a result of this implementation certainly range in the thousands.
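The parsing concept was roughly as follows; this Python sketch assumes a hypothetical line format (the real Fastpack serial output and the original Excel/VBA tooling differed):

```python
import csv
import re

# Hypothetical example of captured serial output; the actual instrument text
# format differed, so the pattern below is an assumption for illustration.
captured = """\
01/15/2010 09:32 INSTR=FP0042 ASSAY=TPSA RESULT=1.84 UNITS=ng/mL
01/15/2010 09:41 INSTR=FP0042 ASSAY=TSH  RESULT=2.10 UNITS=uIU/mL
"""

pattern = re.compile(
    r"(?P<date>\S+) (?P<time>\S+) INSTR=(?P<instrument>\S+) "
    r"ASSAY=(?P<assay>\S+)\s+RESULT=(?P<result>\S+) UNITS=(?P<units>\S+)"
)

rows = [m.groupdict() for m in (pattern.match(line) for line in captured.splitlines()) if m]

# Write a flat file that Excel (or anything else) can consume directly,
# eliminating manual transcription from the instrument screen.
with open("fastpack_results.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["date", "time", "instrument", "assay", "result", "units"])
    writer.writeheader()
    writer.writerows(rows)
```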
Now that we had some automated data collection, it was time to take QC to the next level. I developed Excel-based tools and led implementation of results trending (Xbar and R charts) in reagent and instrument final test. Logic was also implemented in the production documentation for decision making based on the chart data. Following this lead, others implemented similar processes on the reagent fill line. Looking at trends, rather than snapshots, has many advantages, not the least of which is identifying process degradation that can be addressed proactively rather than reactively based on customer feedback.
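For readers unfamiliar with Xbar and R charts, here is a small sketch of how the control limits are computed, using standard SPC constants and hypothetical final-test data (not the original Excel tooling):

```python
import numpy as np

# Standard control-chart constants for subgroups of size n = 4
# (A2, D3, D4 from any SPC reference table).
A2, D3, D4 = 0.729, 0.0, 2.282

def xbar_r_limits(data):
    """Compute Xbar and R chart centerlines and control limits.

    `data` is an (m subgroups x n replicates) array of final-test results.
    """
    data = np.asarray(data, dtype=float)
    xbars = data.mean(axis=1)
    ranges = data.max(axis=1) - data.min(axis=1)
    xbar_bar, r_bar = xbars.mean(), ranges.mean()
    return {
        "xbar_center": xbar_bar,
        "xbar_limits": (xbar_bar - A2 * r_bar, xbar_bar + A2 * r_bar),
        "r_center": r_bar,
        "r_limits": (D3 * r_bar, D4 * r_bar),
    }

# Hypothetical final-test data: 5 subgroups of 4 measurements each.
example = [[9.9, 10.1, 10.0, 10.2],
           [10.0, 10.3, 9.8, 10.1],
           [9.7, 10.0, 10.2, 9.9],
           [10.1, 10.0, 10.1, 10.2],
           [9.8, 9.9, 10.0, 10.1]]
print(xbar_r_limits(example))
```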
One of the important early continuous process improvement projects was in support of the total PSA assay. This was and is one of Qualigen’s bestselling assays. In this particular instance there was consternation in the field due to the assay imprecision in the upper portion of the assay’s analytical range. This fact was detected by customers because at least a couple of the proficiency samples (American Proficiency Institute) had concentrations within this range. I developed an alternative method, using optimization techniques, for handling the assay signal-versus-time data; this led to approximately a 25% reduction in total imprecision. Interestingly enough, the reductions were realized in the instrument-to-instrument component of variation. This instrument variation was related to photomultiplier tube (PMT) variation. Prior to the transition described below, the Fastpack instrument used a photon counting type PMT. This sensor is excellent at reliably detecting low light level signals, but at high signal levels, due to sensor-to-sensor variation related to pulse coincidence, the efficacy degrades. The signal math I developed reduces the effect of this variation specifically.
A significant issue surfaced that caused a Fastpack instrument stop-shipment. To ensure proper instrument function, at the beginning of every assay, a PMT dark current assessment is performed. To pass this check, readings must be below a defined threshold. For no apparent reason, instruments being produced started failing this check. Engineer Wallace Ballentine and I systematically evaluated and discovered the source of the issue was the adhesive used to affix the door label. Unlike before, this material now emitted light when mechanically stimulated during test. We solved the problem by removing a portion of the label adhesive near the PMT window. Instrument production resumed.
Instrument reliability is always an important contributor to after-sales costs. To mitigate this risk, we designed and implemented an instrument burn-in process. Here the electromechanical processes are cycled multiple times over the course of days, exercising the pneumatics, the motor transmission assembly, and the photomultiplier tube (PMT). Also, as part of this effort, I wrote an Excel VBA software application to automate the data analysis. The output reports included an assessment of the pneumatics (individual valve function), graphs of motor motion versus time, and PMT readings versus time (dark current assessment).
Filling reagent pouches for the Fastpack system is arguably the most important aspect of Qualigen operations, as all revenue results from their sale. I studied this line for the purpose of improving robustness and noted at least two opportunities. The process was stopped in the middle of the filling day to check fluid volumes; I noted that stopping the line added more risk than the fill check mitigated. Once running, the machine is happiest if the fluids keep flowing. Also, because bubbles in the filling line presented a problem from time to time, I introduced the use of in-line debubblers (common in HPLC). Both refinements were implemented.
A significant challenge for the Fastpack instrument platform emerged in the form of parts obsolescence. This included the main CPU chip, the software storage (smart media card), and the PMT. I identified the replacement PMT and led integration of the new light sensor. The old PMT was a photon counting type while the new PMT had voltage output proportional to incident light intensity. Integration of this sensor was interesting because the new sensor had to be implemented such that it was equivalent to the old sensor, i.e., from the user's (internal and external customers') perspective, the old and new instruments must use the same reagent consumable with the same calibration. To achieve this end, I developed a transformation model, which included a new PMT calibration method using a fixed light source. This model was a modification of the one defined in the Hamamatsu handbook. The process was successfully deployed.
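As a rough illustration only, the sketch below fits a simple power-law mapping between paired fixed-light-source readings from the two sensor types and inverts it so the new sensor reports on the old scale; the actual transformation was a modification of the Hamamatsu handbook model and is not reproduced here:

```python
import numpy as np

# Hypothetical paired readings collected from a fixed light source stepped
# over several intensities: counts from the old photon-counting PMT and
# voltage-derived readings from the new analog PMT (assumed values).
old_counts = np.array([1.0e3, 5.0e3, 2.0e4, 1.0e5, 5.0e5])
new_signal = np.array([0.021, 0.11, 0.42, 2.1, 10.3])

# Fit a power-law mapping, new = a * old**b, as a straight line in log space.
b, log_a = np.polyfit(np.log(old_counts), np.log(new_signal), 1)
a = np.exp(log_a)

def to_equivalent_counts(new_reading):
    # Invert the fitted mapping so the new sensor reports on the old
    # sensor's scale and existing reagent calibrations still apply.
    return (new_reading / a) ** (1.0 / b)

print(to_equivalent_counts(new_signal))   # should approximately recover old_counts
```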
Product shelf life is always important. Longer shelf lives allow customers to manage their consumption more easily. Short shelf lives put pressure on operations because the economy of scale gained from larger batches can be limited. Lastly, short shelf lives can have drastic effects on international shipments, as larger shipments are required for these accounts and reagents oftentimes spend time in central warehouses prior to distribution. When the stability of FT4 reagent became an issue, I reviewed the data from years of shelf-life testing and realized the reagent decay profiles had some consistent structure. Reagent decay nearly always occurs but, for the most part, the decay is corrected by calibration. In this instance the linear decay assumption was limited, i.e., the percentage decay in signal versus time was not consistent across the measuring range. Fortunately, the extent of this non-linearity was consistent and could be corrected for mathematically. The correction model I developed was successfully implemented in instrument software, leading to an increase in shelf life from 4 months to 8 months.
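To illustrate the flavor of such a correction (the actual model and numbers are not shown here), this sketch assumes a signal-level-dependent decay rate interpolated from shelf-life data and uses it to restore a raw reading to its time-of-manufacture equivalent:

```python
import numpy as np

# Hypothetical signal-level-dependent decay rates (fraction of signal lost
# per month) estimated from shelf-life data; the real correction model and
# numbers differ, this is only a sketch of the idea.
signal_levels = np.array([1.0e3, 1.0e4, 1.0e5, 1.0e6])    # RLU, assumed
decay_per_month = np.array([0.010, 0.015, 0.025, 0.040])  # assumed, non-constant

def corrected_signal(raw_signal, months_since_manufacture):
    # Interpolate the decay rate appropriate to this signal level, then
    # scale the raw signal back to its time-of-manufacture equivalent.
    rate = np.interp(np.log10(raw_signal), np.log10(signal_levels), decay_per_month)
    return raw_signal / (1.0 - rate) ** months_since_manufacture

print(corrected_signal(2.5e5, months_since_manufacture=6.0))
```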
The TSH sample type project, for which I was project manager, was not particularly interesting technically, as this was “just” a matter of adding serum to the list of allowable sample types. Based on this simple labeling change, we decided to perform a well replicated serum-versus-plasma study and submit this evidence to the FDA. They reviewed the data and decided that additional testing was required. The additional testing included linearity, precision, limit of quantitation, and interference/cross-reactivity studies. The moral of the story is: include contingencies in your schedules to allow for FDA submission iterations. Because the sample type addition was approved, customers have greater flexibility in their laboratory process.
Like the TSH project, the 25 uL tPSA project was also not particularly interesting technically, as this was “just” a matter of introducing a form of the assay to Europe with a sample size of 25 uL rather than the standard 100 uL. Based on this simple labeling change, we decided to perform linearity, precision, limit of quantitation, and hook-effect studies. The hook-effect performance (the purpose of the change) was significantly improved, and the other analytical metrics met specifications. We successfully created a new tPSA kit for sale in Europe.
The use of CLSI protocols as the basis for clinical system performance evaluations is standard in the IVD industry. As such, I created tools for performing Deming and Passing-Bablok regression, EP5 (precision), EP6 (linearity), and EP17 (limit of quantitation) data analysis. Although these analyses are relatively straightforward for those familiar with them, having well documented, easy-to-use tools available for the non-expert definitely helps facilitate analysis and user training.
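As an example of the kind of tool involved, here is a small Deming regression sketch using the standard closed-form slope; the data shown are hypothetical and this is not the original Excel tool:

```python
import numpy as np

def deming(x, y, lam=1.0):
    """Deming regression of y on x.

    `lam` is the ratio of the y-error variance to the x-error variance
    (lam = 1 gives orthogonal regression). Returns (slope, intercept).
    """
    x, y = np.asarray(x, float), np.asarray(y, float)
    mx, my = x.mean(), y.mean()
    sxx = np.sum((x - mx) ** 2)
    syy = np.sum((y - my) ** 2)
    sxy = np.sum((x - mx) * (y - my))
    slope = (syy - lam * sxx + np.sqrt((syy - lam * sxx) ** 2 + 4 * lam * sxy ** 2)) / (2 * sxy)
    intercept = my - slope * mx
    return slope, intercept

# Hypothetical method-comparison data (candidate vs. comparative method).
x = np.array([1.2, 2.5, 3.1, 4.8, 6.0, 7.4, 9.1])
y = np.array([1.3, 2.4, 3.3, 4.9, 6.2, 7.2, 9.4])
print(deming(x, y))
```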
Next Generation Development: During reprieves in current business activity, and because the use of whole blood is critical for a CLIA-waived device, I decided to research whole blood separation methods. While surveying the web, I found Vivid blood separation membranes. I had a brief conversation with Wajdi on the subject, and learned that he had some samples in his filing cabinet. The research was off and running. With mechanical engineer Wallace Ballentine’s help (solid engineer and great guy), some mounts were designed to evaluate the material for its ability to separate plasma from whole blood by mechanically mating the membrane with various candidate plasma absorbent materials; it worked! Based on this, development of a quantitative plasma separation device was pursued. Early on, Diagnostics Consulting Network helped optimize the geometry of the laminate. Later on, Duane Lawson was brought in to help optimize and finalize the design. In conjunction with laminate design, the folks at Symbient helped with the laminate housing (sample port) design. Suzanne Poirier was also brought on to help with lab measurements. Ultimately, we ended up with a device that could deliver a quantitative volume of plasma which could be introduced into a reagent pouch. Two similar U.S. Patents, 9664668 and 9903799 (Whole blood analytic device and method therefor), cover this invention.
Now that we had a separation device, in order to fulfill another aspect of the CLIA waiver requirements, a method was needed to verify proper sample introduction. I decided to look at machine vision for this. With help from Matt Remnek at Cognex and engineer Joe Prizzi from Mindflow Design, a test fixture was developed. Based on my detailed input, a method was developed to look at the change in grayscale (10 bit) over time and region of the sample collection pad. Once dialed in, the process proved to be a powerful and sensitive method for sample metering. U.S. Patent 20180342062, Systems and Methods for Sample Verification, covers this invention.
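A minimal sketch of the underlying idea, using synthetic frames and assumed thresholds (the actual Cognex fixture, optics, and criteria differ):

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic stand-in for a machine-vision sequence: 10-bit grayscale frames
# of the sample collection pad captured as the sample wicks in.
n_frames, h, w = 30, 64, 64
frames = np.full((n_frames, h, w), 900.0) + 5.0 * rng.normal(size=(n_frames, h, w))
for t in range(n_frames):
    wet_cols = min(w, int(t * 3))          # wetted region grows over time
    frames[t, :, :wet_cols] -= 350.0       # wetted pad reads darker

# Mean grayscale of the region of interest, tracked frame by frame.
roi = (slice(8, 56), slice(8, 56))
trace = frames[:, roi[0], roi[1]].mean(axis=(1, 2))

# Sample introduction is "verified" when the ROI darkens past a threshold
# within the allowed time window (threshold and window are assumptions).
threshold, window = 700.0, 25
verified = np.any(trace[:window] < threshold)
print("sample verified:", verified)
```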
We now had two critical modules defined, so a next step was to design the reagent pouch for the next generation analyzer. Wallace and I did this by taking the pouch design from the current Fastpack analyzer and modifying it to accommodate the sample collection pad/port. We also added reagent chambers for added assay format flexibility. U.S. Patent D830573, Reagent pack, covers this invention. The pack design provides the basis for significant aspects of the detailed next generation instrument design.
Up until this time, we had worked on critical modules for the system and not the total system itself. Now, given full project possibilities, I began writing requirements documents for the instrument and the consumable. The high-level documentation described the hardware and software user interfaces, engineering controls required for CLIA waiver, and key subsystem interactions. Of course, the devil is always in the details. The hardware portion included many considerations taken from the first-generation product along with the new plasma separation device and sample metering vision system aspects. What I am most proud of, however, are the detailed software specifications. I designed all of the “under the hood” requirements for instrument operation from a software perspective, which of course included detailed descriptions of the analytical processes, including all signal handling from raw sensor data to final results.
Industrial, mechanical, and electrical engineering was started by Mindflow and later passed off to Qualigen to complete. In support of industrial and human factors engineering, I traveled with an engineer to a number of would-be customer sites to study their labs and operations. The participants were also interviewed. Wallace played a key role in the mechanical engineering for the instrument. He was also the primary designer of the equipment used to integrate the plasma separation device with the reagent pouch.
The software engineering was performed exclusively by Promenade Software. I led weekly meetings where we tracked progress and refined requirements. Projects must view requirements as living documents and plan their revision as part of the development process. New details are always discovered, and new and changing requirements emerge. Also, one very important feature of system development, especially with software, is ad-hoc testing. This is an excellent way to find bugs and discover ease-of-use opportunities. Managing software releases, requirements updates, and bug tracking was facilitated through the use of the Atlassian suite of tools (Jira, Confluence, and Bitbucket).
Also, a partnership that evolved during the development process was a collaboration with Sekisui Diagnostics LLC. I had the privilege to work with key Sekisui team members Garrett Garner (technical leader) and Paul Halloran. Garrett and I worked closely with hands-on development of the system (Garrett contributed copious volumes of good work) while Paul also provided key leadership and valuable technical input.
To automate the generation of reagent-lot-specific signal-versus-concentration curve parameters, I wrote a VBA-based software tool. The process is based on my detailed software requirements for results calculations.
Since public demonstrations of the instrument have been performed at the national AACC meeting, I have no problem sharing some poor cell phone pictures through this LINK. Shown are the instrument in standby, a test in process screen, patient result screen, calibration/QC result screen, and instrument self-test screen.
When I left Qualigen the instrument was in the process of clinical validations. This work was professionally conducted by Mark Sarno’s group. Unfortunately, this portion fell short of completion as the project funding ran out. We designed and developed a GREAT CLIA-waivable instrument. This analytically sound system is easy to use and has a flexible assay application architecture.
Amprion is a cutting-edge biotechnology company with a CLIA certified and CAP accredited high complexity laboratory that focuses on neurodegenerative disease diagnosis and I am proud to have served on this team. This is a great company filled with talented people on a noble mission. Although I cannot mention many of my accomplishments at Amprion as most are part of active projects, I can say I had a tremendous experience. While at Amprion, the areas of my professional presence that expanded the most include:
As part of our efforts, we have developed and implemented a clinical laboratory test (SYNTap®) that measures the presence of misfolded alpha-synuclein which is the definitive biomarker for diagnosing synucleinopathies (Parkinson’s Disease, Dementia with Lewy Bodies (DLB), Multiple System Atrophy, and mixed pathology Alzheimer’s). This most certainly has moved the state of the art forward.
As evidence, scientific reporting in this area has grown significantly in recent years. For example, one paper from the Amprion clinical laboratory (Dr. Kendal Jensen and I are the primary contributors) highlights some assay features. Here, through the analysis of diagnostic error rates as a function of cohort, the accuracy of this assay is suggested. It was published in the Journal of Neurology and is titled “Seed Amplification Assay Results Illustrate Discrepancy in Parkinson’s Disease Clinical Diagnostic Accuracy and Error Rates”. Also, I contributed to another paper from our laboratory (Neurology) titled, “Association of CSF alpha-synuclein-SAA seed amplification results with clinical features of possible and probable dementia with Lewy bodies”. This paper, which involved the study of a DLB cohort, suggests that the clinical diagnostic process would be improved through the use of the Amprion assay.
Lastly, I played a minor role (data collation and transfer) in developing this Alzheimer's & Dementia manuscript, Misfolded α-synuclein co-occurrence with Alzheimer’s disease proteinopathy.