Mixing studies have traditionally been used to differentiate factor deficiencies from inhibitors. To begin, we will provide a historical overview of how mixing studies were instrumental in the discovery of coagulation factors.
Although this test has been used for decades, it lacks standardization because PT and APTT reagents vary in their sensitivity to factor deficiencies and to non-specific inhibitors (drugs and lupus anticoagulants (LA)).
We will discuss the advantages and disadvantages of recommended methods for interpreting mixing studies (correction into the reference range, percentage correction, the Rosner index, and the estimated factor correction). We will compare the accuracy of these cut-off methods using examples of single factor deficiency versus conditions associated with multiple factor deficiencies.
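As a concrete illustration (not drawn from this abstract itself), the sketch below computes two of the named interpretation metrics, percentage correction and the Rosner index (index of circulating anticoagulant), using their commonly published formulas; the clotting-time values are hypothetical, and any cut-offs applied to these metrics are reagent- and laboratory-specific and must be validated locally.

```python
def percentage_correction(patient_ct, mix_ct, npp_ct):
    """Percentage correction of a 1:1 mixing study.

    Commonly cited formula:
        100 * (patient CT - 1:1 mix CT) / (patient CT - normal pooled plasma CT)
    Higher values (more complete correction) favor a factor deficiency.
    """
    return 100.0 * (patient_ct - mix_ct) / (patient_ct - npp_ct)


def rosner_index(patient_ct, mix_ct, npp_ct):
    """Rosner index (index of circulating anticoagulant, ICA).

    Commonly cited formula:
        100 * (1:1 mix CT - normal pooled plasma CT) / patient CT
    Higher values favor an inhibitor (e.g., a lupus anticoagulant).
    """
    return 100.0 * (mix_ct - npp_ct) / patient_ct


# Hypothetical APTT values in seconds, for illustration only.
patient_aptt, mix_aptt, npp_aptt = 75.0, 40.0, 30.0
print(f"% correction: {percentage_correction(patient_aptt, mix_aptt, npp_aptt):.1f}")
print(f"Rosner index: {rosner_index(patient_aptt, mix_aptt, npp_aptt):.1f}")
```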
Lastly, we will discuss the use of mixing studies to detect inhibitors, including LAs. We will then examine the debate over incorporating mixing studies into LA testing algorithms, and the need for standardization of LA interpretation guidelines will be illustrated with instructive cases.
Objectives
• Evaluate methods to determine correction on a 1:1 mixing study
• Illustrate limitations of these cut-off methods for conditions involving multiple factor deficiencies
• Discuss the use of mixing studies to interpret lupus anticoagulant testing