

Design of Experiments (DOE): Applications and Benefits in Quality Control and Assurance

Submitted: 09 June 2023 Reviewed: 24 November 2023 Published: 23 February 2024

DOI: 10.5772/intechopen.113987


From the Edited Volume

Quality Control and Quality Assurance - Techniques and Applications

Edited by Sayyad Zahid Qamar and Nasr Al-Hinai



This chapter explores the applications and benefits of Design of Experiments (DOE) in the context of quality control and quality assurance. DOE is a statistical methodology that enables researchers and practitioners to systematically investigate and optimize processes, identify critical factors affecting quality, and reduce variability and waste. The chapter begins with an overview and definitions of DOE, covering its history, types, the steps involved in conducting an experiment, and its key components. It then examines the specific applications of DOE in quality control and quality assurance, highlighting its significance across various industries and sectors, and demonstrates how DOE can be effectively applied to optimize products and processes, reduce defects and variation, improve quality, implement Six Sigma, and validate and verify processes. Furthermore, the chapter addresses challenges and considerations in implementing DOE in real-world scenarios, such as resource constraints, experimental constraints, and data analysis complexities, and provides basic information on software tools commonly used in DOE.

Keywords

  • design of experiments (DOE)
  • quality control
  • process variability
  • optimization

Author Information

Sheriff Lamidi*

  • Lagos State University of Science and Technology (Formerly Lagos State Polytechnic), Lagos, Nigeria

Rafiu Olalere

Adekunle Yekinni, Khairat Adesina

*Address all correspondence to: [email protected]

1. Introduction

Quality control and assurance are crucial aspects of any manufacturing or industrial process. Ensuring high-quality products and services is essential for customer satisfaction, brand reputation, and overall business success. Quality is a measure of the level of conformance of a product to design specifications, or the ability of a product or service to satisfy user requirements. Together, quality assurance and quality control help deliver a defect-free product or service. Quality assurance focuses on preventing defects by ensuring that the approaches, techniques, methods, and processes designed for a project are implemented correctly. Quality control, on the other hand, focuses on identifying defects by ensuring that those approaches, techniques, methods, and processes are followed correctly [ 1 ]. Quality assurance is process-oriented and a managerial tool, whereas quality control is product-oriented and a remedial tool [ 1 ]. One powerful tool used in quality control and assurance is the design of experiments (DOE), a statistical method for planning and conducting experiments. Engineers and technologists often use DOE methodologies for applications ranging from the design of new products and the improvement of existing designs to the control and improvement of manufacturing processes and the maintenance and repair of products [ 2 , 3 , 4 ]. DOE is used to identify the factors that affect a process and to determine the optimal levels of those factors, as shown in Figure 1 . It can be used to improve the quality of products and processes, reduce costs, and increase efficiency [ 5 ]. This chapter aims to explore the applications and benefits of DOE in quality control and assurance.
Businesses and manufacturing companies can use DOE in a variety of ways to differentiate themselves from the competition, whether by constantly redesigning their products or by creating new products to establish a presence in other markets. First, DOE can be used to identify the factors that most affect the quality of a product. By understanding which factors are most important, businesses can focus their efforts on improving those factors, leading to a product that is more reliable, durable, and user-friendly than the competition’s. Second, DOE can be used to reduce the cost of manufacturing a product. By optimizing manufacturing processes with DOE methodologies, businesses can improve the quality of their products while saving money on labor, materials, and other costs [ 6 ], allowing a lower price that makes the product more competitive. Third, DOE can be used to develop new products that meet the needs of a specific market. By understanding the needs of the target market, businesses can develop products that are more likely to be successful, helping them gain a foothold in new markets and increase their market share.

Figure 1. Design of Experiments (DOE).

1.1 Objective of the chapter

The objective of this chapter is to provide an in-depth understanding of DOE and its applications in quality control and assurance. We will explore various experimental designs, statistical techniques, and methodologies that are commonly used in DOE. Additionally, we will discuss its practical applications across various fields and the benefits and advantages that DOE offers in ensuring and improving quality standards.

1.2 An overview and definitions of design of experiments (DOE)

Definitions of Design of Experiments (DOE): DOE is a statistical methodology used to systematically plan, conduct, analyze, and interpret experiments to obtain valid and reliable results. It allows researchers to efficiently explore and identify the significant factors influencing a process or product’s performance. DOE controls input factors (variables) in order to ascertain their relationships with the outputs (responses), as illustrated in Figure 1 , and thereby ensure product or process quality. DOEs are usually carried out in five stages [ 7 , 8 ], as shown in Figure 2 .

Figure 2. Five stages of DOE.

1.3 History of DOE

DOE has its roots in the work of Sir Ronald Fisher, who developed the basic principles of DOE in the early twentieth century. Fisher’s work was initially applied to agricultural research, but it was soon adapted for use in other fields, including manufacturing, engineering, and medicine [ 9 ]. Since then, many scientists and statisticians have contributed to DOE development and its application in different fields [ 9 , 10 , 11 , 12 ].

1.4 Types of DOE

There are many different types of DOE, each with its own strengths and weaknesses [ 8 , 13 ]. The best type of DOE to use will depend on the specific situation. Factors to consider include the number of factors, the number of levels for each factor, the desired level of confidence, and the time and budget constraints ( Figure 3 ) [ 10 , 14 , 15 , 16 ].

Figure 3. Types of DOE.

The most common types of DOE are:

1.4.1 Full factorial designs

These designs involve testing all possible combinations of factors. For example, if there are two factors with two levels each, there would be four possible combinations (2 × 2 = 4). Full factorial designs are the most comprehensive, but they can also be the most time-consuming and expensive.
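As a sketch of the 2 × 2 example above, a full factorial design can be enumerated with Python’s itertools.product; the factor names and levels here are illustrative, not taken from the chapter:

```python
from itertools import product

# Hypothetical two-factor, two-level setup (names and values are illustrative)
factors = {
    "temperature": [150, 170],   # e.g. degrees C
    "pressure":    [1.0, 1.5],   # e.g. bar
}

# Full factorial: every combination of factor levels becomes one experimental run
names = list(factors)
runs = [dict(zip(names, combo)) for combo in product(*factors.values())]

for i, run in enumerate(runs, 1):
    print(i, run)
# Two factors at two levels each yield 2 x 2 = 4 runs, as in the text.
```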

1.4.2 Screening/fractional factorial designs

Fractional factorial experiments are a type of factorial experiment that uses fewer experimental runs than a full factorial design. These designs involve testing a subset of the possible combinations of factors. Fractional factorial designs are less comprehensive than full factorial designs, but they can save time and money.
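The run-saving idea can be sketched in Python: a hypothetical half-fraction of a three-factor, two-level design sets the third factor equal to the product of the first two (defining relation I = ABC), cutting eight runs to four at the cost of aliasing C with the A×B interaction:

```python
from itertools import product

# Coded levels: -1 (low), +1 (high). Factor names A, B, C are illustrative.
# A full 2^3 factorial needs 8 runs; this half-fraction uses only 4.
half_fraction = []
for a, b in product([-1, 1], repeat=2):
    c = a * b                      # generator: C = A*B (so C is aliased with AB)
    half_fraction.append((a, b, c))

print(half_fraction)   # 4 runs instead of 8
```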

1.4.3 Response surface methodology (RSM) designs

RSM is a type of DOE that is used to fit a mathematical model to the response variable. RSM can be used to identify the optimal levels of the factors and predict the response variable for new combinations of factors. These designs are used to study the relationship between a response variable and multiple factors.
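A minimal illustration of the RSM idea, using invented single-factor data: fit a second-order polynomial to the observed responses and locate the stationary point of the fitted surface. The numbers are hypothetical, and real RSM studies typically involve several factors and dedicated designs such as the central composite design:

```python
import numpy as np

# Toy single-factor data (hypothetical): response peaks near x = 2
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.0, 4.0, 5.0, 4.0, 1.0])

# Second-order response surface model: y = b0 + b1*x + b2*x^2
X = np.column_stack([np.ones_like(x), x, x**2])
b0, b1, b2 = np.linalg.lstsq(X, y, rcond=None)[0]

# Stationary point of the fitted surface: dy/dx = 0  ->  x* = -b1 / (2*b2)
x_opt = -b1 / (2 * b2)
print(round(x_opt, 3))   # optimum factor setting predicted by the model
```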

1.4.4 Mixture designs

These designs are used to study the relationship between a response variable and multiple factors that are mixed together. Mixture designs are often used in the food and beverage industry to understand how the different ingredients in a product affect the taste, texture, and other properties of the product.

1.4.5 Taguchi designs

These designs are a type of fractional factorial design that is specifically designed for quality improvement. Taguchi designs are often used in manufacturing, where it is important to produce products that meet the required quality standards.

1.5 Steps involved in conducting DOE

1. Define clear objectives.

2. Select the process variables.

3. Select a feasible experimental design.

4. Execute the selected design.

5. Ensure that the data are consistent with the experimental assumptions.

6. Analyze and interpret the results.

7. Present the results and apply them to decision making.

8. Draw conclusions.

Figure 4. Steps required in DOE.

In the application of the concept of DOE methodology for quality control and assurance, the following terminologies, otherwise known as components of DOE, are commonly used:

2. Key concepts/components of DOE

Factors: Variables that may influence the outcome of an experiment.

Levels: The values at which factors are set during an experiment.

Response Variable: The outcome or output variable that is measured or observed.

Experimental Units: The entities or subjects on which the experiments are conducted.

Treatment: The combination of factor levels applied to an experimental unit.

Replication: The process of repeating the experiment to reduce variability and enhance reliability.
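The terminology above can be sketched as a small Python data structure; the factor names, levels, and replicate count are hypothetical:

```python
from itertools import product

# Factors and their levels (illustrative values, not from the chapter)
factors = {"temperature": [150, 170], "catalyst": ["A", "B"]}

# A treatment is one combination of factor levels applied to an experimental unit
treatments = [dict(zip(factors, combo)) for combo in product(*factors.values())]

# Replication: repeat each treatment to reduce variability and enhance reliability
replicates = 3
run_plan = [t for t in treatments for _ in range(replicates)]

# The response variable would be measured once per run in run_plan
print(len(treatments), len(run_plan))   # 4 treatments, 12 runs
```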

3. Applications of DOE in quality control and assurance

3.1 Product and process optimization

DOE enables the systematic exploration of various factors and their interactions to optimize product and process performance. By identifying the key factors and their optimal levels, manufacturers can improve quality, reduce costs, and enhance efficiency.

3.2 Defects and variation reduction

DOE helps identify the root causes of defects and variations in a manufacturing process. By conducting experiments and analyzing the results, quality engineers can pinpoint the factors that contribute to defects and develop strategies to reduce or eliminate them.

3.3 Quality improvement and six sigma

DOE is an integral part of Six Sigma methodologies, which aim to achieve process excellence and reduce variation. By using DOE, organizations can identify critical process parameters, set optimal levels, and implement strategies to minimize defects and variations, thus improving overall quality.

3.4 Process validation and verification

DOE plays a crucial role in the validation and verification of manufacturing processes. By conducting designed experiments, organizations can gather data on process performance, determine critical process parameters, and establish robustness and reliability of their processes.

4. Key factors affecting quality optimization processes and waste reduction

Process Design and Standardization: Well-designed processes with clear specifications and standard operating procedures (SOPs) play a vital role in optimizing quality and minimizing waste. Factors such as process layout, equipment selection, workflow efficiency, and error-proofing mechanisms can significantly impact the quality of output and waste generation [ 17 , 18 ].

Quality Control and Monitoring: Effective quality control measures, including robust inspection protocols, real-time monitoring systems, and statistical process control (SPC) techniques, help identify and rectify quality issues promptly. Monitoring critical process parameters and implementing quality control checks at various stages can minimize defects and waste [ 12 , 19 ].
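As a simplified illustration of the SPC monitoring mentioned above, the sketch below computes 3-sigma control limits for a set of invented measurements; a production individuals chart would normally estimate dispersion from the moving range rather than the plain sample standard deviation used here:

```python
import statistics

# Invented in-spec measurements of a critical process parameter
measurements = [10.1, 9.8, 10.3, 10.0, 9.9, 10.2, 10.1, 9.7, 10.0, 10.2]

mean = statistics.mean(measurements)
sd = statistics.stdev(measurements)

# 3-sigma upper and lower control limits around the process mean
ucl, lcl = mean + 3 * sd, mean - 3 * sd

# Flag any point outside the control limits for investigation
out_of_control = [x for x in measurements if not (lcl <= x <= ucl)]
print(round(mean, 3), round(ucl, 3), round(lcl, 3), out_of_control)
```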

Training and Skill Development: Well-trained and skilled personnel are essential for maintaining quality standards and reducing waste. Adequate training programs that emphasize quality awareness, technical skills, and problem-solving capabilities contribute to consistent quality optimization and waste reduction [ 20 ].

Continuous Improvement and Lean Practices: Embracing continuous improvement methodologies, such as Lean Six Sigma, can drive quality optimization and waste reduction. Tools like value stream mapping, root cause analysis, and Kaizen events enable organizations to identify and eliminate process inefficiencies, defects, and non-value-added activities [ 21 ].

5. Software tools commonly used in DOE

Minitab: Minitab is a popular statistical software package widely used for DOE. It offers a comprehensive set of DOE tools, including factorial designs, response surface methods, and mixture designs. Minitab provides easy-to-use graphical and statistical analysis features, making it suitable for both beginners and experienced users.

JMP: JMP is a powerful statistical software developed by SAS. It offers a range of DOE techniques, such as factorial designs, response surface methods, and mixture designs. JMP provides an interactive interface with drag-and-drop capabilities for designing experiments, analyzing data, and visualizing results.

Design-Expert: Design-Expert is a specialized software tool specifically designed for DOE. It offers a wide range of experimental design options, including factorial designs, response surface methods, mixture designs, and Taguchi designs. Design-Expert provides advanced graphical and statistical analysis features to facilitate the optimization of processes and product formulations.

R: R is a popular open-source programming language for statistical computing and graphics. It has a rich collection of packages that support DOE, such as the ‘DOE’ package and ‘rsm’ package. R provides extensive flexibility and customization options for designing experiments, analyzing data, and performing advanced statistical modeling.

Excel: Microsoft Excel, though not specifically designed for DOE, can be used for simple experimental designs and analysis. It offers basic statistical functions, charts, and data analysis tools that can be utilized for conducting DOE experiments and analyzing results.

6. Some examples of specific fields where DOE found its practical application

6.1 Manufacturing industry

DOE can be used to optimize the process of manufacturing a part, identify the root cause of a quality problem, or reduce the variability of a process, which is a measure of quality. It can be used to identify the causes of defects in a product or to find ways to reduce the time it takes to manufacture a product. For example, a manufacturer that wants to produce a machine part (from an Al-Si alloy) with minimum surface roughness must combine three controllable variables: cutting speed, feed rate, and depth of cut. Because many combinations of these variables are possible, DOE can be used to study the effect of the three machining variables on the surface roughness (Ra) of the Al-Si alloy [ 22 , 23 ].
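The machining study described above can be sketched as a two-level full factorial in Python. The coded design is standard, but the Ra responses, and therefore the resulting effect sizes, are invented for illustration and not taken from [ 22 , 23 ]:

```python
from itertools import product

# Two-level full factorial for three coded factors (-1 = low, +1 = high):
# A = cutting speed, B = feed rate, C = depth of cut
design = list(product([-1, 1], repeat=3))          # 8 runs
ra =     [1.8, 2.0, 2.6, 3.1, 1.7, 1.9, 2.5, 3.0]  # hypothetical Ra per run

# Main effect of a factor = mean Ra at +1 minus mean Ra at -1
def main_effect(col):
    hi = [y for run, y in zip(design, ra) if run[col] == 1]
    lo = [y for run, y in zip(design, ra) if run[col] == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

effects = {name: round(main_effect(i), 3)
           for i, name in enumerate(["speed", "feed", "depth"])}
print(effects)   # in this made-up data, feed rate has the largest effect
```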

In the automotive industry, DOE is used to improve the fuel efficiency of cars and trucks. By identifying the factors that most affect fuel efficiency, businesses can design cars and trucks that use less fuel. This can lead to lower emissions and a reduced cost of ownership for the customer.

In the food industry, DOE is used to improve the taste and texture of food products. By understanding how different factors affect the taste and texture of food, businesses can develop products that are more appealing to consumers. This can lead to increased sales and a stronger brand reputation.

In the pharmaceutical industry, DOE is used to develop new drugs that are more effective and less harmful than existing drugs. By understanding how different factors affect the effectiveness and toxicity of drugs, businesses can develop drugs that are more likely to be approved by the agency in charge. This can lead to increased profits and to a better quality of life for patients.

6.2 Engineering

DOE can be used to optimize the design of a car engine or to improve the yield of a chemical reaction [ 10 ]. DOE can be used to improve the design of products and processes, reduce costs, and increase efficiency. Furthermore, DOE can be used to optimize the design of a bridge, identify the root cause of a failure, or reduce the weight of a product.

DOE can be used to optimize the design of composite materials for specific structural applications. Factors such as fiber type, fiber volume fraction, resin content, and curing parameters can be varied systematically to achieve desired mechanical properties such as strength, stiffness, and impact resistance [ 24 ].

DOE is used to optimize process parameters to improve yield and quality in semi-conductor engineering. Factors such as temperature, pressure, etching time, and gas flow rates can be varied systematically to identify the optimal settings that result in minimal defects and enhanced performance [ 25 ].

6.3 Medicine

DOE can be used to optimize drug formulations to improve the bioavailability of pharmaceutical products. Factors such as excipient composition, drug concentration, and manufacturing parameters can be systematically varied to identify the optimal combination that maximizes drug absorption and efficacy [ 26 ].

DOE can be used to optimize the dosage of a drug, to identify the side effects of a treatment, or to reduce the risk of a disease. DOE is used in medicine to improve the effectiveness of treatments and to reduce the side effects of drugs. For example, DOE can be used to identify the best dose of a drug to treat a particular condition or to find ways to reduce the toxicity of a drug.

DOE can be utilized to optimize treatment protocols in radiation therapy for cancer patients. Factors such as radiation dose, treatment duration, and beam angles can be systematically varied to identify the optimal combination that maximizes tumor control while minimizing side effects on healthy tissues [ 24 , 27 ].

6.4 Agriculture

DOE can be used to evaluate the effects of fertilizer formulations on crop yield. In the work of [ 17 ], DOE was used to investigate the impact of different fertilizer formulations on crop yield. The study involved varying factors such as nitrogen, phosphorus, and potassium concentrations in the fertilizer mix. By systematically designing and conducting experiments, they were able to determine the optimal combination of nutrients that maximized crop yield while minimizing the amount of fertilizer required.

DOE can be used to assess the impact of irrigation techniques on water use efficiency. The effect of different irrigation techniques on water use efficiency in crop production was evaluated by [ 28 ]. Various factors, such as irrigation frequency, irrigation duration, and water application rate, were manipulated and studied. The experiments allowed the researchers to identify the optimal combination of irrigation practices that resulted in improved water use efficiency without compromising crop yield [ 28 ].

Optimization of plant growth conditions in controlled environments was conducted by [ 29 ] using DOE. Growth conditions were optimized in controlled environments such as greenhouses and growth chambers: factors such as light intensity, temperature, humidity, and CO 2 levels were systematically varied to determine the optimal combination that promoted plant growth, development, and yield.

6.5 Marketing

DOE can be used to optimize product packaging design elements such as color, shape, size, and labeling to understand their impact on consumer perception and purchase behavior. By systematically varying these factors, marketers can identify the optimal packaging design that maximizes consumer appeal and product sales [ 30 ].

DOE can be employed to evaluate different pricing strategies and their impact on consumer behavior, purchase intent, and profitability. Factors such as price levels, discount offers, and promotional strategies can be systematically varied to determine the optimal pricing strategy that maximizes sales and profitability [ 31 ,  32 ].

DOE can be utilized to test and optimize various elements of advertisements, such as visual design, headline, copywriting, and call-to-action. By systematically varying these factors and measuring consumer responses, marketers can identify the optimal combination that maximizes advertisement effectiveness and consumer engagement [ 33 ].

7. Benefits of DOE in quality control and assurance

Efficient Resource Utilization: DOE allows organizations to allocate their resources efficiently by identifying the most influential factors. By focusing on these factors, companies can optimize their processes and achieve significant improvements in quality without unnecessary expenditure.

Cost Reduction: By systematically exploring process factors and their interactions, DOE helps identify cost-effective solutions and the most efficient way to produce a product. By reducing defects, eliminating waste, and optimizing process parameters, organizations can save costs associated with rework, scrap, and material consumption.

Improved Quality: DOE improves the quality of products and processes by identifying and reducing the sources of process variability.

Increased Efficiency: DOE increases efficiency by identifying the root causes of problems and by improving the design of products and processes.

Enhanced Decision Making: DOE provides a structured approach to experimentation, resulting in reliable and statistically valid data. This enables informed decision making based on evidence rather than intuition or guesswork. By using DOE, organizations can make data-driven decisions to improve quality and minimize risks.

Faster Time to Market: DOE facilitates the identification of critical process parameters and optimal levels, leading to faster process optimization. By reducing the time required for experimentation and process development, organizations can accelerate product development cycles and bring products to market more quickly.

8. Challenges and considerations in implementing DOE in real-world scenarios

Resource Constraints: Limited resources such as time, budget, and availability of equipment or materials can pose challenges in implementing DOE. Conducting experiments may require significant time and financial investments. It is essential to carefully plan and allocate resources to ensure the feasibility and success of DOE studies [ 19 , 34 ].

Experimental Constraints: Some experiments may face practical constraints due to factors such as safety regulations, ethical considerations, or limitations in the process or system under investigation. Researchers must identify and address these constraints to design experiments that are feasible and align with regulatory requirements [ 21 , 35 ].

Data Analysis Complexities: Analyzing experimental data and interpreting the results can be challenging, particularly when dealing with complex designs or large datasets. Specialized statistical knowledge may be required to properly analyze and draw meaningful conclusions from the data obtained through DOE. Consideration should be given to the appropriate statistical methods and software tools for analyzing the experimental results [ 35 , 36 ].

Planning for Interactions and Confounding: Identifying and addressing potential interactions among factors and confounding effects can be complex in DOE. Interactions and confounding can affect the interpretation of experimental results and lead to incorrect conclusions. Careful consideration and appropriate experimental design strategies, such as fractional factorial designs, can help mitigate these challenges [ 16 , 37 , 38 ].

9. Conclusion

In this chapter, the authors focused on the applications and benefits of DOE in quality control and assurance. DOE is an important statistical methodology that enables scientists, engineers, researchers, and other professionals to design, develop, and optimize high-quality products and services, and it has found application in many fields. Applying DOE ensures high-quality products and services, increases customer satisfaction, promotes efficient resource utilization, reduces costs, enhances decision making, improves brand reputation, and supports overall business success. Given the numerous applications of DOE in various fields presented in this chapter, it can be concluded that DOE is a powerful research tool and methodology for quality control and assurance.

Acknowledgments

The authors greatly appreciate the opportunity to have this review chapter included in the book Quality Control and Quality Assurance - Techniques and Applications. We also commend the rigorous peer review process.

  • 1. Kiran Kumar Panigrahi. 2023. Available from: https://www.tutorialspoint.com/differences-between-quality-assurance-and-quality-control
  • 2. Tanco M, Viles E, Ilzarbe L, Álvarez MJ. Manufacturing industries need Design of Experiments (DoE). In: Proceedings of the World Congress on Engineering. London, UK: Newswood Limited; 2007
  • 3. Smith J, Johnson A. Application of design of experiments (DOE) methodologies in engineering and technology. Journal of Engineering and Technology Applications. 2020; 15 (2):45-62
  • 4. Taguchi G. System of Experimental Design. New York: Wiley; 1987
  • 5. Dahlgaard JJ, Dahlgaard-Park SM, Boesgaard KM. Enablers of waste management in Danish companies: Training, commitment and motivation. Journal of Cleaner Production. 2007; 15 (18):1765-1775
  • 6. Gutiérres PH, De la Vara SR. Análisis y diseño de experimentos. Journal of Chemical Information and Modeling. 2012. DOI: 10.1017/CBO9781107415324.004
  • 7. Wu CFJ. Design of Experiments: Theory and Practice. Hoboken, NJ: Wiley; 2006
  • 8. McCool J. Using the Weibull Distribution: Reliability Modelling, and Interference. New Jersey: John Wily and Sons; 2012
  • 9. Madadian Bozorg N, Leclercq M, Lescot T, Bazin M, Gaudreault N, Dikpati A, et al. Design of experiment and machine learning inform on the 3D printing of hydrogels for biomedical applications. Biomaterials Advances. 2023; 153 :213533. DOI: 10.1016/j.bioadv.2023.213533
  • 10. Ilzarbe L, Álvarez MJ, Viles E, Tanco M. Practical applications of design of experiments in the field of engineering: A bibliographical review. Quality and Reliability Engineering. 2008; 24 (4):417-428. DOI: 10.1002/gre.909
  • 11. Assia C. Designer’s Guide to Lab Practice. 1st ed. London: Routledge; 2023
  • 12. Ebadi M, Mozdianfard MR, Aliabadi M. Employing full factorial design and response surface methodology for optimizing direct contact membrane distillation operational conditions in desalinating the rejected stream of a reverse osmosis unit at Esfahan refinery. Water Supply. 2018; 19 (2):492-501. DOI: 10.2166/ws.2018.094
  • 13. Myers RH, Montgomery DC. Response Surface Methodology: Process and Product Optimization Using Designed Experiments. United Kingdom: John Wiley & Sons; 2016
  • 14. Michael Sadowski. 2022. Available from: https://www.synthace.com/blog/types-of-doe-design-a-users-guide
  • 15. Terry WA. Introduction to Experimental Methods. 1st ed. Boca Raton: Taylor and Francis Group; 2023
  • 16. Khavekar RS, Hari VS. Analyzing the need for a comparative study of Shainin DoE and traditional DoE tools for deploying six sigma in Indian manufacturing companies. IOP Conference Series: Materials Science and Engineering. 2018; 376 :012121
  • 17. Zhang Y, Yu Z, Wang J, Li Y, He X. Optimization of fertilizer formulations for sustainable agriculture using design of experiments and response surface methodology. Journal of Plant Nutrition. 2018; 41 (6):713-725
  • 18. Yang CH, Huang JT. Process optimization and layout design for electronics manufacturing service industry. Mathematical Problems in Engineering. 2020; 2020 :1-13
  • 19. Montgomery DC. Design and Analysis of Experiments. 8th ed. Hoboken, NJ: Wiley; 2013
  • 20. Antony J, Banuelas R. Key ingredients for the effective implementation of six sigma program. Measuring Business Excellence. 2002; 6 (4):20-27
  • 21. Anderson MJ, Whitcomb PJ, Montgomery DC. Design of experiments in quality engineering. In: Encyclopedia of Statistics in Quality and Reliability. 2nd ed. New York City, USA: John Wiley & Sons; 2017. pp. 1-15
  • 22. Kasali Aderinmoye A, Sheriff BL. Effect of dry cutting system on surface finish in turning operation of Al-Si alloy. Journal of Multidisciplinary Engineering Science and Technology. 2021; 8 :10
  • 23. Lamidi S, Olaleye N, Bankole Y, Obalola A, Aribike E, Adigun I. Applications of response surface methodology (RSM) in product design, development, and process optimization. In: Response Surface Methodology - Research Advances and Applications. London, UK: IntechOpen; 2023. DOI: 10.5772/intechopen.106763
  • 24. Wang Y, Lu H, Zhu Q, Tang Y. Design optimization of composite materials using DOE for stiffness. Polymer Testing. 2019; 80 :106119
  • 25. Lee Y, Kim C, Park S. Optimization of the etching process parameters in plasma etching of Al 2 O 3 using a design of experiment. Journal of the Korean Physical Society. 2020; 77 (3):219-224
  • 26. El-Helaly SN, Mostafa SA, Hussein AK. Optimization of theophylline oral disintegrating tablets formulation using design of experiments for enhanced bioavailability. Drug Development and Industrial Pharmacy. 2020; 46 (1):33-43
  • 27. Vicente V, Zerón H, Martínez Á, Palomo A, Garayoa J. Optimization of treatment protocols in radiation therapy using design of experiments: A review. International Journal of Radiation Oncology *Biology* Physics. 2017; 99 (4):839-846
  • 28. Rashid A, Aslam M, Azizullah A, Shah M. Enhancing water use efficiency in wheat crop using different irrigation techniques. Environmental Monitoring and Assessment. 2019; 191 (1):9
  • 29. Körner O, Challa H. Plant phenotyping using DOE and machine learning algorithms in controlled environments. In: Proceedings of the 10th International Conference on Computer Vision Theory and Applications (VISAPP). Vol. 1. Lisbon Portugal: SciTePress; 2017. pp. 547-554
  • 30. Hoegg J, Hofmann J, Meckel K. Packaging aesthetics and consumer preferences: A cross-cultural study using experimental auctions and choice tasks. Journal of Marketing. 2017; 81 (1):86-101
  • 31. Ariely D, Wertenbroch K. Procrastination, deadlines, and performance: Self-control by precommitment. Psychological Science. 2002; 13 (3):219-224
  • 32. Box GEP, Hunter WG, Hunter JS. Statistics for Experimenters: Design, Innovation, and Discovery. New York, USA: Wiley; 2005
  • 33. Liu X, Dong Y. Designing optimal advertising messages for different cultures: An experimental study. International Journal of Advertising. 2018; 37 (4):588-610
  • 34. Montgomery DC. Design and Analysis of Experiments. Arizona, USA: John Wiley & Sons; 2017
  • 35. Hicks CR, Turner KV. Fundamental Concepts in the Design of Experiments. United Kingdom: Oxford University Press; 1999
  • 36. Roberto F, Alberto M, Luca P, Luigi S. Design of experiments and machine learning with application to industrial experiments. Stat Papers. 2023; 64 :1251-1274. DOI: 10.1007/s00362-023-01437-w
  • 37. Box GEP, Hunter WG, Hunter JS. Statistics for Experimenters: Design, Innovation, and Discovery. 2nd ed. Hoboken, NJ: Wiley; 2005
  • 38. Grant EL, Leavenworth RS. Statistical Q uality Control. New York City, USA: McGraw-Hill Education; 2013


  • Open access
  • Published: 06 August 2019

A Design of Experiments (DoE) Approach Accelerates the Optimization of Copper-Mediated 18 F-Fluorination Reactions of Arylstannanes

  • Gregory D. Bowden (ORCID: orcid.org/0000-0003-2274-6738) 1 ,
  • Bernd J. Pichler 1 , 2 &
  • Andreas Maurer (ORCID: orcid.org/0000-0003-2412-5361) 1 , 2

Scientific Reports volume 9, Article number: 11370 (2019)

25k Accesses

102 Citations

4 Altmetric


  • Drug development
  • Nuclear chemistry

Recent advancements in 18 F radiochemistry, such as the advent of copper-mediated radiofluorination (CMRF) chemistry, have provided unprecedented access to novel chemically diverse PET probes; however, these multicomponent reactions have come with a new set of complex optimization problems. Design of experiments (DoE) is a statistical approach to process optimization that is used across a variety of industries. It possesses a number of advantages over the traditionally employed “one variable at a time” (OVAT) approach, such as increased experimental efficiency as well as an ability to resolve factor interactions and provide detailed maps of a process’s behavior. Here we demonstrate the utility of DoE to the development and optimization of new radiochemical methodologies and novel PET tracer synthesis. Using DoE to construct experimentally efficient factor screening and optimization studies, we were able to identify critical factors and model their behavior with more than two-fold greater experimental efficiency than the traditional OVAT approach. Additionally, the use of DoE allowed us to glean new insights into the behavior of the CMRF of a number of arylstannane precursors. This information has guided our decision-making efforts while developing efficient reaction conditions that suit the unique process requirements of 18 F PET tracer synthesis.


Introduction

Positron emission tomography (PET) has become an important imaging technique that is used routinely in clinical practice and as a powerful biomedical research tool 1 . PET, as with other nuclear imaging modalities, relies on the appropriate use of well-designed radiotracers, molecules that are labelled with a positron emitting radionuclide and are designed to target and accumulate in specific organs, cells, diseased tissues, and/or biochemical pathways, providing physiological and molecular information about the subject 2 . The accessible and flexible design and radiosynthesis of novel tracers is a cornerstone of PET imaging as a preclinical research technique and the efficient and scalable development and production of new PET tracers is vital to the advancement of PET imaging as a clinically relevant tool 3 .

Of the many radioisotopes that can be readily produced with small medical cyclotrons, 18 F has become particularly popular for medical imaging due to its almost ideal nuclear properties. Its decay mode (97% by positron emission), short positron range in tissue, high specific activity and practical 110-minute half-life have made it an attractive isotope for both clinical PET imaging and preclinical research and development 2 , 4 . However, in large part due to fluoride’s large hydration energy, basicity, and weak nucleophilicity, late stage radiofluorinations are synthetically challenging 5 . These reactions have, up until recently, been restricted to a relatively small subset of nucleophilic substitution reactions on aliphatic carbons (SN2) or electron-deficient aromatic rings (SNAr). The limited number of synthetic tools available to radiochemists has in turn restricted the diversity and accessibility of new 18 F radiotracers and has hence hindered their development 3 . Additionally, most clinical and pre-clinical radiosyntheses need to be carefully designed so that they can be performed in automated synthesis modules, which adds an additional layer of complexity when developing scalable and clinically relevant 18 F tracer syntheses 2 .

Recently however, new 18 F labeling methodologies have been published that have begun to push the field forward, opening new avenues for radiotracer design and synthesis 2 , 5 , 6 , 7 , 8 . Seminal works published by the groups of Sanford, Gouverneur and Scott have provided unprecedented new synthetic tools for the late-stage radiolabeling of electron-rich and -neutral aromatic rings through the copper-mediated radiofluorinations (CMRF) of aryl boronic acids, aryl boronic esters and arylstannanes (Figure  1 )  9 , 10 , 11 , 12 , 13 . These reactions have been demonstrated through the synthesis of a number of clinically relevant tracers, and a number of groups, including our own, have begun to adopt these methodologies for the development of novel PET tracers 14 , 15 .

figure 1

Recent copper-mediated nucleophilic radiofluorinations of electron-rich and electron-neutral ( a ) arylboronic esters by Tredwell et al ., ( b ) arylboronic acids by Mossine et al ., and ( c ) arylstannane precursors by Makaravage et al . 9 , 10 , 11 .

figure 2

( a ) The OVAT approach resolves reaction space one dimension at a time. The DoE approach builds a matrix of experimental runs to model a response surface across all reaction space. Different design types allow for ( b ) efficient factor screening studies or ( c ) more focused and detailed response surface optimization studies. The color grading represents the value of the true response (blue low, red high). This figure has been recreated and modified with permission from the catalysisconsulting.co.uk website 35 .

However, many of these reactions have suffered from poor reproducibility and synthesis performance at larger scales. Works by a number of groups have identified the processing of the 18 F (through QMA cartridge elution and azeotropic drying) as a critical step, as the copper mediator is particularly sensitive to the strong bases present in standard QMA eluents 14 , 15 . A number of efficient protocols to improve 18 F QMA recovery rates and reaction conversions have thus been developed 16 , 17 , 18 , 19 , 20 . These general “unified” conditions, which include the popular “minimalist” 18 F processing approach, have allowed CMRF chemistry to become a more frequently utilized tool for novel tracer development. However, in addition to the 18 F processing method, CMRF reactions are themselves also complex, multicomponent processes and thus require the optimization of multiple nuanced, non-linear, and (as we will show) precursor-specific experimental factors. The synthesis of almost every novel tracer (with the goal of automation) must undergo an extensive optimization process with respect to the reaction conditions, especially where new methodologies are utilized. This remains a crucial yet difficult, expensive, and often rate-limiting step in the tracer/synthesis development pipeline.

The uptake of new radiochemical methodologies into routine use is heavily dependent on the reaction’s optimized operational simplicity, scalability, reliability, and efficiency in terms of both radiochemical conversion (%RCC) and byproduct formation (radiochemical purity and specific activity) 3 . Traditionally, these methodologies are optimized through the “one variable at a time” (OVAT) approach, which aims to hold all reaction variables ( X i ) constant while one is adjusted until a maximum %RCC or isolated radiochemical yield (%RCY) (response, Y i ) is observed. This process is repeated until all factors suspected of affecting the response of interest have been optimized one by one (Fig.  2a )  21 . This procedure is simple but laborious and time-consuming, requiring many individual runs across an often-large number of parameters, many of which may have no significant contribution to the response. As this approach only looks at one factor at a time, it is unable to detect factor interactions, where the setting of one factor may affect the influence of another, and thus it often provides results that are difficult to interpret 22 . Additionally, the results of an OVAT study are dependent on the starting settings of the optimization process and as such, OVAT is prone to finding only local optima and may thus miss the true set of optimal conditions 23 .
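
To make the OVAT pitfall concrete, the following is a minimal, hypothetical sketch (not taken from the paper): a toy response surface with a factor interaction, on which an OVAT search from a poor starting point settles on a suboptimal setting while a full factorial grid over the same candidate settings does not. All names and numbers are invented for illustration.

```python
# Hypothetical response surface with a temperature/loading interaction term.
def response(temp, loading):
    return -(temp - 0.7 * loading - 2.0) ** 2 - (loading - 6.0) ** 2 + 50.0

def ovat_search(temps, loadings, start_loading):
    # Step 1: vary temperature while holding loading at its starting value.
    best_t = max(temps, key=lambda t: response(t, start_loading))
    # Step 2: vary loading while holding temperature at the step-1 "optimum".
    best_l = max(loadings, key=lambda l: response(best_t, l))
    return best_t, best_l

temps = [0, 2, 4, 6, 8]
loadings = [0, 2, 4, 6, 8]

# OVAT from a poor starting point (loading = 0):
t_ovat, l_ovat = ovat_search(temps, loadings, start_loading=0)
y_ovat = response(t_ovat, l_ovat)

# Full factorial grid over the same 25 settings:
t_grid, l_grid = max(((t, l) for t in temps for l in loadings),
                     key=lambda p: response(*p))
y_grid = response(t_grid, l_grid)

print(round(y_ovat, 2), round(y_grid, 2))  # OVAT stalls at 38.16; grid finds 49.96
```

Because the interaction couples the two factors, the best temperature found at the starting loading is no longer best at the final loading, which is exactly the failure mode described above.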

An alternative to the OVAT approach is factorial experimental design or “Design of Experiments” (DoE), a systematic and statistical approach to process optimization that has been widely used by process engineers and chemists across a multitude of industries 24 . Unlike OVAT, DoE aims to explore, map and model the behavior of the response (or multiple responses) within a given reaction space (the combined ranges of all factors involved) across multiple factors simultaneously by varying all variables at once according to a predefined experimental matrix (Fig.  2b,c ). DoE is thus able to provide a more detailed picture of the behavior of a particular process with experimental efficiency and is able to determine the contribution of each factor to the system, model the effect of each factor on the response, and resolve factor interactions. Even with low-resolution factor screening designs, where multiple factors may be confounded, DoE aids in decision making and in the planning of further optimization studies 21 , 25 . As DoE data is analyzed statistically across a whole study (using multiple linear regression (MLR)), the error throughout the regression model can be estimated without the need for the multitude of replicate experiments (with the exception of replicate center-point experiments, which are used to calculate the pure error) typically performed in OVAT studies, further increasing the experimental efficiency of the approach. Furthermore, the advent of user-friendly software packages, such as Modde and JMP , has helped to lower the barrier of entry of DoE for researchers with basic experience in statistical analysis 26 . In addition to those cited here, the numerous practical and scientific advantages of the DoE approach have been well outlined in a number of excellent reviews 21 , 22 , 23 , 24 , 25 , 26 , 27 .

DoE studies are usually conducted in sequential phases to answer specific scientific questions and there are a large number of different DoE designs that can be used in various situations to maximize the amount and quality of information obtained from the lowest number of experimental runs 25 , 27 . Typically, a DoE optimization will begin with a low resolution (highly confounded) fractional factorial screening design (Sup. Fig.  1a,b ) in order to screen a large number of continuous (temperature, reagent stoichiometry, concentration, time, etc.) or discrete (atmosphere, solvent, reagent identity, etc.) variables that may affect the investigated response (%RCC, specific activity (SA), etc.) These “factor screening” (FS) experiments are designed to ascertain which factors have the largest influence on the response, give limited information on the presence of factor interactions and eliminate non-significant factors in as few runs as possible. They are thus usually not detailed enough to provide an accurate, predictive model of the system in question. Once the significant factors are identified, higher resolution response surface optimization (RSO) studies with a reduced subset of experimental factors can be constructed and performed if necessary (Sup. Fig.  1c,d ). These designs usually contain more experimental points (per factor) and are intended to produce a detailed mathematical model of the process’s behavior.
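
As an illustration of how a screening design of the kind described above can be constructed, here is a minimal sketch (an assumed textbook construction, not the authors' actual design matrix) of a 2^(5-1) fractional factorial in coded units, using the standard generator E = ABCD:

```python
from itertools import product

def frac_factorial_2_5_minus_1():
    """Build a 2^(5-1) fractional factorial design in coded (-1/+1) units.

    The fifth factor is generated from the defining relation E = ABCD,
    which yields a Resolution V design: main effects and two-factor
    interactions are not confounded with each other.
    """
    runs = []
    for a, b, c, d in product((-1, 1), repeat=4):
        e = a * b * c * d        # generator E = ABCD
        runs.append((a, b, c, d, e))
    return runs

design = frac_factorial_2_5_minus_1()
center_points = [(0, 0, 0, 0, 0)] * 3   # replicate center points estimate pure error
print(len(design))  # 16 factorial runs (half of the 32-run full factorial)
```

Five factors are thus screened in 16 runs plus a few center points, rather than the 32 runs a full factorial would need.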

DoE has been previously demonstrated as a powerful tool for exploring and understanding new radiochemical methodologies 28 , 29 . In the context of copper-mediated radiosynthesis, DoE may provide a practical and efficient way to expedite the optimization process by increasing one’s understanding of the factors affecting the radiosynthesis of a new tracer at an early stage of its development. As DoE aims to maximize the information that can be obtained from a limited number of experimental runs, well-constructed DoE studies would save time, reduce the experimental resources (expensive cartridges, reagents and hot-cell/lead-castle time) devoted to the development of new methods and the optimization of synthesis protocols for new tracers, and would lower the exposure of researchers to harmful ionizing radiation.

The aim of the presented study was to assess the usefulness of a DoE approach to the study and optimization of the CMRFs of model arylstannanes as disclosed by Makaravage et al . and to glean insights into the most important experimental factors that must be considered when attempting to optimize a tracer synthesis using this methodology 11 . This information was applied to an RSO DoE constructed to optimize the late-stage CMRF of 2-{(4-[ 18 F]fluorophenyl)methoxy}pyrimidine-4-amine ([ 18 F] p FBC), a novel tracer under development in our group that had previously suffered from poor synthesis performance and proved difficult to optimize through the conventional approach. Additionally, we used an RSO study to optimize the single step production of 4-[ 18 F]fluorobenzyl alcohol ([ 18 F] p BnOH), an 18 F synthon of importance to a number of ongoing multistep radiosynthesis projects within our laboratory. In doing so, we highlight the use of DoE within the field of radiochemistry as a powerful tool to enhance radiochemical method development, expedite tracer synthesis optimization, and provide useful practical information about the process under investigation. This information could aid in general decision making when translating a radiosynthesis to an automated synthesis module, ultimately bringing it in line with current Good Manufacturing Practices (cGMP) for clinical production.

Results and Discussion

OVAT vs DoE: the advantage of better optimization routines

In order to assess the benefit of investigating the DoE approach for radiochemical process optimization, we studied the supplementary information of the original paper disclosing the CMRF of arylstannanes by Makaravage et al 11 . The authors investigated 8 non-discrete experimental factors, stating that each run was performed at least twice (n ≥ 2). Each factor was investigated across 3–6 different settings. Assuming n = 2 runs were performed for each setting, the authors therefore performed a minimum of 74 experimental runs (counted from the SI) to investigate the reaction’s behavior. Zarrad et al . later conducted a similar OVAT optimization study on a variation of this methodology that was based upon an improved QMA 18 F processing method suitable for large-scale automated syntheses 16 . While their study successfully led to the development of a scalable and automatable procedure for the production of a number of PET tracers from aryltrialkylstannanes, it too required great experimental effort.

In contrast, a fractional factorial Resolution IV (RES IV) DoE study consisting of as few as 19 runs could be performed to identify which factors had the largest influence on the response (Sup. Table  1 ). If, for example, 3 factors were identified as significant, a high-resolution response surface optimization experiment (consisting of only 17 runs) could then be carried out to estimate a more detailed map of the experimental space. Thus, the DoE approach (across both FS and RSO studies) would, if valid, provide a more comprehensive model of the process in just 36 (vs 74) runs. This marked potential improvement in experimental efficiency prompted us to further investigate DoE as a tool for radiochemical optimization.

Factor screening of the CMRF of arylstannanes

In order to identify the factors that had the most significant effects on the reaction outcome, a factor screening Resolution V + (RES V + ) fractional factorial design (capable of resolving main effects, 2 factor interactions, and revealing the presence of curvature in the model) was constructed using Modde Go 12 (Umetrics). 4-Tributylstannylbiphenyl ( 1 ) was chosen as a model substrate due to its availability, the low volatility of the product 4-[ 18 F]fluorobiphenyl ( [ 18 F]2 ) on TLC plates, and its prevalence in the literature as a standard model compound for radiofluorination method development (Fig.  3 ) 11 . The precursor amount was set at 2 mg (4.5 µmol) across all runs. Five factors, namely reaction solvent volume (DMA vol: 400–1000 µl DMA), temperature (Temp: 100–140 °C), copper triflate loading (Cu(OTf) 2 : 1–4 eq relative to substrate), pyridine loading (Pyridine: 4–30 eq), and atmosphere (Atm: argon vs air), were identified in pilot experiments and through literature consultation as factors of interest. A number of previous studies, including that reported by Zarrad et al ., have reported enhanced yields when these reactions are performed in air 9 , 15 , 16 , 19 . The effect of using argon or air on the %RCC was however difficult to compare and quantify during pilot experiments and thus it was included as a qualitative factor (argon or air) in the factor screening DoE. Time was not investigated as a factor as i) time is related to temperature in most chemical processes and ii) given the short half-life of 18 F, it is more desirable to set a reaction duration of < 30 min. The radiochemical conversion of the reaction (%RCC) was chosen as the response ( Y %RCC ), as it can be quickly and accurately measured by radioTLC.

figure 3

The investigated factors and their ranges for the fractional factorial factor screening of the model synthesis of 4-[ 18 F]fluorobiphenyl ( [ 18 F]2 ) from 4-tributyltinbiphenyl ( 1 ). In a fractional factorial design, experimental points are arranged at the corners of a K-dimensional hypercube. With p generators used to form the array, the design comprises a 1/2^p fraction of the runs of the full factorial experiment (all vertices of the hypercube). Center points (CP, shown in green) are repeated experiments carried out at the center of the hypercube to estimate reproducibility and measure curvature in the response surface.

The fractional factorial experimental design entailed a total of 24 experimental runs composed of 16 experimental points with 8 center point experiments. Due to the practical constraints of processing and using radiofluoride, the factor screening DoE (and future RSO DoE studies) needed to be run over multiple days. It was decided, given the time required to perform each reaction, that 6 experiments per day was optimal. To account for uncontrollable factors brought about through day-to-day variances in radiofluoride quality and quantity, QMA cartridge variations, and variations in QMA eluent, the experiments were arranged into 4 blocks of 6 runs. Each block, which contained two replicate center points to assess reproducibility, was included in the model as a blocking factor to account for day-to-day variations in uncontrollable factors.
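
The blocking arrangement described above can be sketched as follows. The run-to-block allocation shown is purely illustrative (a real blocked design would confound blocks with a high-order interaction rather than slicing the run list in order):

```python
from itertools import product

# Factorial portion: a 2^(5-1) design, 16 runs (generator E = ABCD).
factorial = [(a, b, c, d, a * b * c * d)
             for a, b, c, d in product((-1, 1), repeat=4)]
centers = [(0, 0, 0, 0, 0)] * 8

# Assign 4 factorial runs + 2 replicate center points to each of 4 day-blocks,
# so that every block carries its own reproducibility check.
blocks = []
for day in range(4):
    block = factorial[4 * day:4 * day + 4] + centers[2 * day:2 * day + 2]
    blocks.append(block)

print([len(b) for b in blocks])  # → [6, 6, 6, 6]
```

Fitting "block" as an extra categorical factor then absorbs day-to-day shifts (fluoride quality, cartridge variation) so they do not bias the factor estimates.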

18 F trapped on a QMA cartridge (preconditioned with NaHCO 3 ) was eluted with the QMA eluent as published by Makaravage et al . and was divided among the 6 reaction vials in 80 µl aliquots. The limitations of this “aliquot” method have been well documented in the literature 15 . The lower base/salt content present in smaller aliquot volumes of QMA eluent has less of a negative effect on %RCC than if a full QMA eluent “batch” is used. As such, %RCC values obtained via the aliquot method are often not representative of the %RCC obtained from large-scale batch elutions of 18 F with the same QMA eluent. However, despite this limitation, we chose to aliquot the 18 F into each reaction as this would better allow us to measure and account for variances in each QMA cartridge elution from day-to-day (between blocks) and would also ensure that the QMA eluent content present in each reaction vial after azeotropic drying would be reasonably stable within each block. The minimization of large sources of experimental error was of paramount importance to the construction of an accurate DoE model.

After performing each run, the experimental results were analyzed using Modde Go 12. To obtain a normal distribution of the data, the %RCC data set was log10-transformed, fitted to a model using multiple linear regression (MLR), and checked for outliers and model quality. The output summary statistics suggested that the model was good enough for the purposes of factor screening (R 2  = 0.91 (goodness of fit), Q 2  = 0.57 (goodness of model prediction)). The normal coefficients of each term in the model were used to gauge the significance of the contribution of the corresponding factors to the response (p = 0.05) (Fig.  4 ). The model suggested that both temperature (Temp) and total DMA volume (DMA Vol: reaction volume/concentration) were non-significant factors over the investigated ranges. Catalyst loading (Cu(OTf) 2 ) and ligand loading (Pyridine) were determined to be significant factors. The model also suggested the presence of curvature in the response surface, but due to factor confounding inherent in the RES V + experimental design, a more detailed RSO experiment would need to be conducted to determine which quadratic terms would be required to fit an accurate model. The presence of missing quadratic terms in the linear factor screening model could explain the low Q 2 term in the model fit statistics. Additionally, no significant differences between the experimental blocks (Block 1–4) were observed, suggesting the experimental protocol to be stable from day-to-day.
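
For readers unfamiliar with the R 2 and Q 2 statistics quoted here, the following toy sketch computes both for a simple one-factor linear fit on invented data: R 2 = 1 − SSres/SStot measures goodness of fit, while Q 2 = 1 − PRESS/SStot uses leave-one-out prediction errors to measure predictive power. This is a generic illustration, not the paper's model or data.

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x; returns (intercept, slope)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

def r2_q2(xs, ys):
    a, b = fit_line(xs, ys)
    my = sum(ys) / len(ys)
    ss_tot = sum((y - my) ** 2 for y in ys)
    ss_res = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))
    # PRESS: refit with each observation left out, then predict it.
    press = 0.0
    for i in range(len(xs)):
        ai, bi = fit_line(xs[:i] + xs[i + 1:], ys[:i] + ys[i + 1:])
        press += (ys[i] - (ai + bi * xs[i])) ** 2
    return 1 - ss_res / ss_tot, 1 - press / ss_tot

xs = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
ys = [1.1, 2.9, 5.2, 6.8, 9.1, 10.8]   # invented data, roughly y = 1 + 2x
r2, q2 = r2_q2(xs, ys)
print(r2, q2)  # Q2 is never larger than R2 for an OLS fit
```

A large gap between R 2 and Q 2 (as in the screening model above, 0.91 vs 0.57) signals that the model fits the observed runs much better than it predicts new ones, e.g. because quadratic terms are missing.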

figure 4

The scaled and centered regression coefficients calculated from the results of the fractional factorial factor screening DoE. Large regression coefficients represent factors with large contributions to the response (%RCC). A positive number denotes a positive influence on the response; a negative number denotes a diminishing effect on the response. To fit an accurate model, non-significant terms would need to be eliminated, but for the purposes of factor screening, these non-significant terms are shown here. If a factor’s regression coefficient is smaller than its associated error bars, it is probable (at the 95% confidence level) that the factor is not significant.

The factor screening DoE also suggested that, when using stoichiometric quantities of Cu(OTf) 2 (1–4 eq), the choice of atmosphere (Atm (argon/air)) was not a significant factor and that the presence of atmospheric oxygen does not significantly enhance the reaction over the ranges investigated. Interestingly however, a non-significant factor interaction between catalyst loading and atmosphere was detected. At high Cu(OTf) 2 loadings, an argon atmosphere is slightly preferred, while at low Cu(OTf) 2 loadings, an air atmosphere is beneficial. While its insignificance warrants its exclusion from further experimental designs and models, the trend suggested by this interaction is in line with the current understanding of the oxidation cycle of the Chan-Lam coupling 30 . When catalytic quantities of Cu(II)(OTf) 2 are used, an oxidative atmosphere (air) is required to activate the catalytic complex to a Cu(III) species and to regenerate the catalyst after it undergoes reductive elimination. When larger amounts of Cu(II)(OTf) 2 are used, the reaction can be performed under argon, as the oxidation of the inactive Cu(II) complex to the active Cu(III) complex is mediated by free Cu(II) through a single electron transfer 31 . An important conclusion from this result is that this CMRF can be performed in automated synthesizers using inert carrier gases; operating these reactions under air is not a requirement when stoichiometric loadings of Cu(OTf) 2 are used, as was originally suggested by Makaravage et al . 11 . Most routine radiosynthesis modules are set up and optimized to operate using an inert carrier gas such as nitrogen, argon or helium. While it is possible to set up and operate many synthesis modules using compressed air, it can be inconvenient to modify established routine (or GMP) syntheses and synthesis modules to operate with air.

Response surface optimization of [ 18 F]pFBC

[ 18 F] p FBC ( [ 18 F]4 ), produced from precursor ( 3 ), is a novel tracer under development in our laboratory that had shown poor synthesis performance and reliability (Fig.  5 ). Our efforts to optimize its synthesis iteratively through the OVAT approach in conjunction with previously published optimization data had given inconsistent and confusing results 11 , 16 . Thus, having identified and eliminated reaction solvent volume, temperature, atmosphere, and day-to-day uncontrollable factors as non-significant, a more detailed orthogonal central composite design (CCO) RSO study was constructed to optimize the radiosynthesis of this tracer. Cu(OTf) 2 loading (1–4 eq), pyridine loading (10–40 eq), and precursor loading (10–30 µmol) were chosen as factors for investigation. The reaction volume was kept constant across all runs at 700 µl and each run was performed at 110 °C for 15 min.

figure 5

The investigated factors and their ranges for the orthogonal central composite design RSO of [ 18 F]pFBC ( [ 18 F]4 ). In an orthogonal central composite design (CCO), the star-point distance a is scaled so as to ensure orthogonality throughout the experimental matrix.

The CCO design, a type of central composite design (CCD), was chosen due to its ability to estimate second order response surfaces and resolve quadratic terms in the response surface model. The CCO design consisted of a total of 17 runs: 8 factorial points, 3 center points and 6 orthogonally scaled star points (Fig.  5 ). The 17 runs were again carried out using 80 µl aliquots of 18 F in accordance with the general procedure described in the supplementary information. The data was modeled using MLR and analyzed in Modde Go 12. All three main factors were found to be significant, and the experiment also resolved quadratic behaviors for both the catalyst and pyridine loading factors (Fig.  6a ). Additionally, a factor interaction between pyridine and substrate loading was resolved and included in the model. The summary of fit statistics gave R 2 and Q 2 values of 0.97 and 0.91 respectively, indicating a valid and predictive model. Strong quadratic behaviors were found for both the Cu(OTf) 2 loading and pyridine factors, and a strong negative factor interaction was detected between the equivalents of pyridine and the amount of substrate used (higher amounts of pyridine are needed for lower amounts of precursor). Plotting the response surface across the investigated ranges suggested that the optimal set of conditions consisted of 3.5 equivalents of catalyst and 25 equivalents of pyridine (a ratio of ≈1:7) at a 10 µmol substrate load (Fig.  6b ). Thus, three validation runs were performed using larger 180 µl aliquots of the QMA solution (400–500 MBq) under the optimized conditions (Fig.  6c ). These three runs gave respective %RCCs of 24.9%, 25.3%, and 29.8% (26.7 ± 2.7 %RCC (n = 3)), demonstrating the robustness of these conditions and giving the highest %RCCs obtained for [ 18 F] p FBC thus far using this reaction.
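
The 17-run CCO layout described above can be sketched in coded units as follows. The orthogonality scaling used here for the star distance is the standard textbook formula and is an assumption on our part; Modde's exact scaling may differ.

```python
from itertools import product
from math import sqrt

def cco_design(k=3, n_center=3):
    """Sketch of an orthogonal central composite design (CCO) in coded units.

    Assumes the textbook orthogonality scaling for the star distance:
        alpha = sqrt((sqrt(F * N) - F) / 2),
    with F = 2**k factorial points and N = total number of runs.
    """
    F = 2 ** k
    n_star = 2 * k
    N = F + n_star + n_center
    alpha = sqrt((sqrt(F * N) - F) / 2)

    factorial = [list(p) for p in product((-1, 1), repeat=k)]
    star = []
    for i in range(k):                      # +/- alpha along each axis
        for s in (-alpha, alpha):
            point = [0.0] * k
            point[i] = s
            star.append(point)
    centers = [[0.0] * k for _ in range(n_center)]
    return factorial + star + centers, alpha

runs, alpha = cco_design()
print(len(runs), round(alpha, 3))  # 17 runs; alpha ≈ 1.353
```

The factorial points estimate main effects and interactions, the star points supply the quadratic terms, and the replicate centers provide the pure-error estimate.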

figure 6

( a ) The scaled and centered regression coefficients calculated from the results of the RSO (CCO). ( b ) 4D plot output from Modde Go 12. Pyridine (ligand) and catalyst loadings are plotted on the vertical and horizontal axes respectively. The three windows, from right to left, represent an increasing amount of substrate (10–30 µmol). ( c ) Reaction conditions and radiochemical conversions of the optimized CMRF synthesis of [ 18 F] p FBC.

Response surface optimization of the synthesis of [ 18 F]4-fluorobenzyl alcohol ([ 18 F]pFBnOH)

The synthesis of [ 18 F] p FBnOH ( [ 18 F]6 ), an important radiochemical building block, has also been of interest to a number of projects within our laboratory. [ 18 F]6 has been previously synthesized in two steps via the nucleophilic aromatic substitution of 4-formyl- N,N,N -trimethylanilinium triflate and the subsequent reduction of the resulting 4-[ 18 F]fluorobenzaldehyde to [ 18 F]6 32 , 33 . In our hands, the reduction step using NaBH 4 resulted in a significant loss of the product [ 18 F]6 and we thus chose to investigate the CMRF of 4-tributyltinbenzyl alcohol 5 as a possible single-step alternative route to [ 18 F]6 (Fig.  7 ). [ 18 F]6 could be reasonably purified via solid-phase extraction before use in a second synthesis step (these results will be published in due course). Using the information obtained from our previous DoE studies, an RSO experiment was constructed to optimize the synthesis of [ 18 F]6 using a Box-Behnken design (BBD) (Fig.  7 ). The BBD requires slightly fewer runs than an equivalent CCD and also avoids experimental runs with combined extremes of the experimental factors. The three-factor Box-Behnken design featured a total of 15 runs (12 experimental points with 3 center points). Again, substrate loading (5–25 µmol), catalyst loading (1–4 eq) and pyridine loading (5–30 eq) were chosen as factors for investigation. The reaction volume was again kept constant across all runs at 700 µl and the reactions were each performed according to the general procedure at 110 °C for 20 min.
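
The 15-run three-factor BBD layout can be sketched in coded units as follows (an illustrative construction, not Modde's output):

```python
from itertools import combinations

def box_behnken(k=3, n_center=3):
    """Sketch of a Box-Behnken design in coded (-1/0/+1) units.

    Edge points take every pair of factors to their +/-1 settings while
    the remaining factor(s) stay at the center, so no run combines the
    extremes of all factors at once.
    """
    runs = []
    for i, j in combinations(range(k), 2):
        for si in (-1, 1):
            for sj in (-1, 1):
                point = [0] * k
                point[i], point[j] = si, sj
                runs.append(point)
    runs += [[0] * k for _ in range(n_center)]
    return runs

design = box_behnken()          # 3 factors: substrate, Cu(OTf)2, pyridine loadings
print(len(design))              # 12 edge points + 3 center points = 15 runs
```

Avoiding the corner runs is attractive here, since combining maximal substrate, catalyst, and pyridine loadings in one vial is exactly the kind of extreme condition the BBD is designed to skip.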

Figure 7

The investigated factors and their ranges for the Box-Behnken response surface optimization design for the synthesis of [18F]pFBnOH ([18F]6) from p-tributyltinbenzyl alcohol (5). The BBD arranges the experimental points on the edges of the reaction space cube and can be thought of as a combination of three 2D full factorial designs (performed at 90° to each other) with shared center points.
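To make the geometry of the design concrete, the coded run matrix described in the caption can be generated in a few lines. This is a minimal sketch of how a Box-Behnken design matrix is constructed, not the Modde Go 12 implementation; the factor ranges are taken from the text, and the helper name `box_behnken` is our own.

```python
from itertools import combinations, product

def box_behnken(n_factors, n_center=3):
    """Coded (-1/0/+1) run levels for a Box-Behnken design: each pair of
    factors is varied over a 2x2 full factorial while the remaining factors
    sit at their center level; center-point replicates are appended."""
    runs = []
    for i, j in combinations(range(n_factors), 2):
        for a, b in product((-1, 1), repeat=2):
            run = [0] * n_factors
            run[i], run[j] = a, b
            runs.append(run)
    runs += [[0] * n_factors for _ in range(n_center)]
    return runs

# Map coded levels onto the real factor ranges used in the text:
# substrate 5-25 umol, Cu(OTf)2 1-4 eq, pyridine 5-30 eq
ranges = [(5, 25), (1, 4), (5, 30)]
design = [
    [lo + (level + 1) * (hi - lo) / 2 for level, (lo, hi) in zip(run, ranges)]
    for run in box_behnken(3)
]
print(len(design))  # 15 runs: 12 edge points + 3 center points
```

Every edge run varies exactly two factors at a time, which is why the BBD never visits the corners of the reaction-space cube where all three factors sit at their extremes.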

Fitting the data using MLR in Modde Go 12 gave summary-of-fit statistics that suggested a valid model (R² = 0.97 and Q² = 0.86). Catalyst loading and pyridine loading were found to be significant factors, with pyridine demonstrating quadratic behavior. In this case, precursor loading was not found to be a significant factor over the investigated range. Plotting the response surface suggested that the optimum reaction conditions featured a high catalyst load and a low pyridine load, with a higher substrate load being slightly (non-significantly) beneficial (Fig. 8). Again, validation runs with larger 180 µl 18F aliquots were performed as before, using a substrate loading of 25 µmol, 4 equivalents of Cu(OTf)2 and 5 equivalents of pyridine in 700 µl of DMA. The outcome afforded [18F]6 with a %RCC of 58 ± 5.3% (n = 4) in a single step. While these results were lower than those predicted by the response surface model, they again provided the product with greater efficiency than previously obtained in our hands using the general fluoride processing and reaction conditions published by Makaravage et al.

Figure 8

The response surface output from the Box Behnken response surface optimization of [ 18 F]6 .
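The R² and Q² statistics quoted above are worth unpacking: R² measures how well the fitted quadratic model reproduces the training runs, while Q² is a leave-one-out cross-validated analogue that penalizes overfitting. The sketch below shows how both can be computed for a three-factor Box-Behnken fit by multiple linear regression. The response values are synthetic, illustrative numbers (not the paper's data), and this is a generic MLR computation, not Modde's implementation.

```python
import numpy as np
from itertools import combinations, product

# Coded Box-Behnken runs for three factors (substrate, catalyst, pyridine)
runs = []
for i, j in combinations(range(3), 2):
    for a, b in product((-1, 1), repeat=2):
        run = [0.0, 0.0, 0.0]
        run[i], run[j] = a, b
        runs.append(run)
runs += [[0.0, 0.0, 0.0]] * 3          # three center-point replicates
F = np.array(runs)

# Full quadratic model matrix: intercept, linear, interaction, squared terms
X = np.column_stack([
    np.ones(len(F)),
    F,
    F[:, 0] * F[:, 1], F[:, 0] * F[:, 2], F[:, 1] * F[:, 2],
    F ** 2,
])

# Hypothetical %RCC response: catalyst positive, pyridine negative with
# curvature, plus a little noise -- illustrative only
rng = np.random.default_rng(0)
y = 50 + 8 * F[:, 1] - 6 * F[:, 2] + 3 * F[:, 2] ** 2 + rng.normal(0, 0.5, len(F))

beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
ss_tot = np.sum((y - y.mean()) ** 2)
r2 = 1 - np.sum(resid ** 2) / ss_tot

# Q2: leave-one-out cross-validated R2 via the hat-matrix shortcut,
# where each LOO residual is e_i / (1 - h_ii)
h = np.diag(X @ np.linalg.pinv(X.T @ X) @ X.T)
press = np.sum((resid / (1 - h)) ** 2)
q2 = 1 - press / ss_tot
```

Because the leave-one-out residuals are always at least as large as the ordinary residuals, Q² ≤ R²; a large gap between the two is a warning sign that the model will not predict new runs well.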

In this case, deviations from the predicted model may be due to factors such as the amount of carbonate base present in the larger volumes of QMA eluent solution (as discussed above), effects from as-yet-unidentified controllable or non-controllable factors specific to this reaction, and/or model/data inaccuracies arising from random or systematic experimental error. Nonetheless, this easily automatable procedure mitigated the product losses sustained using the previously published two-step synthesis approach (ref. 32).

These data were used to successfully guide the automation of [18F]pFBC as well as [18F]pFBnOH (as part of a larger multistep radiosynthesis project) on an Elixys Flex/Chem synthesis module (Sofie Biosciences, USA). These automated syntheses will be reported in due course as part of a larger tracer development study. It must, however, be noted that the radiochemical yields of the automated syntheses were, as expected, significantly lower than those predicted by the response surface model. This is in all likelihood due to the single-"batch" 18F processing used in the automated synthesis versus the "aliquoted" 18F processing used to carry out the DoE experiments. Although the large-scale radiosyntheses were nonetheless useful for imaging studies, this remains a significant limitation of the presented DoE studies. However, we suspect that the results of our reaction optimization can be viewed independently from the known issues associated with fluoride processing, and we are currently working to confirm this hypothesis. As such, we believe that the application of improved fluoride processing techniques, such as the "minimalist" approach to fluoride processing, may help to drastically improve the large-scale performance of our optimized copper-mediated radiofluorination conditions, and we are currently working to implement these methods into our workflow.

Comparing our factor screening and response surface models with the results obtained from the previous OVAT optimization studies by Makaravage et al. and Zarrad et al. reveals remarkably similar trends where the models are comparable (absolute %RCC values differ considerably due to the difference in the 18F processing methods used). For example, substrate load and copper triflate loading both show quadratic behaviors, and their optima are reasonably well aligned with the analogous regions in our response surface models, despite the large differences in 18F processing method. This lends weight to our hypothesis that the experimental factors affecting the reaction can be modeled separately from the 18F processing conditions; i.e., there is no (or only a weak) factor interaction between the 18F processing conditions used and the reaction parameters we investigated in this study. However, this still requires further investigation and will be reported on in due course. The multiparametric response surfaces provided by the DoE studies presented here also highlight the fact that much more information about a process can be obtained from fewer experiments if the DoE approach is appropriately applied.

Comparison of the two response surface models for [18F]pFBC and [18F]pFBnOH shows that the latter requires a lower quantity of pyridine and a higher substrate concentration for optimal radiolabeling, while the synthesis of [18F]pFBC benefits from a lower substrate concentration and a higher pyridine load. This suggests that the nature of the substrate is a major factor when developing optimal CMRF reaction conditions. The presence of some heterocycles has previously been noted to have marked deleterious effects on %RCC, likely due to the formation of unreactive substrate/catalyst species. Taylor et al. examined the effects of various substrates on the analogous CMRF of boronic acid esters by performing reactions with a model substrate while holding the reaction conditions constant and doping the reactions with various heterocycles and other common moieties often found in drug-like molecules (ref. 34). From their results, they were able to construct a database of heterocyclic moieties that are compatible with their radiofluorination conditions, which could be used to plan and "de-risk" future radiosyntheses. Our data suggest that, in certain cases, a detailed understanding of the process and careful optimization of important experimental factors could (to some degree) offset these deleterious effects, thus saving time by reducing the need to design complex multistep synthesis routes around problematic moieties in the candidate precursor. In combination with a database of problematic moieties (such as that published by Taylor et al.), well-designed DoE studies could aid in the establishment of usable radiofluorination protocols early in a tracer's development and thus expedite its passage from conception to its first preclinical studies. Scientists can then quickly decide whether the tracer is biologically interesting and whether further optimization or development of an improved synthetic strategy for GMP production is warranted.

The work presented here highlights the benefit of using the DoE approach to aid in the development of new radiochemical methodologies as well as PET tracer development and production. The systematic use of the DoE approach streamlines the optimization process, saving time and resources while providing multiparametric information that can be used to guide decision-making early in a tracer's development. While we have specifically investigated the use of DoE for optimizing the copper-mediated radiofluorination of arylstannanes as proof of principle, it is important to note that DoE can be applied to any complex optimization problem. The availability of a number of easy-to-use DoE software packages (such as Modde Go 12 and JMP) has allowed us to apply DoE to the synthesis optimization of a number of novel tracers under development, and we are currently applying the presented DoE data and the general DoE approach to expedite the delivery of a number of biologically interesting tracers to imaging scientists within our group. We have also begun to explore the use of DoE as a research tool to guide reaction development and aid in the establishment of new radiochemical methodologies within our laboratory. We hope that DoE will become a more widely used tool that will help bring new radiochemical methods into clinical and preclinical relevancy and, in turn, help expand the chemical diversity of new 18F-labelled tracers.

The synthesis procedures and characterization data of all precursor and non-radioactive standard compounds can be found in the supplementary information attached to this paper, along with the DoE design worksheets and regression model statistics.

General radiochemistry

As a general procedure for all radiochemical experiments, [18F]fluoride in water was obtained from a cyclotron (GE PETtrace 800) target wash and was trapped on a QMA cartridge (QMA Light Carb, Waters; preconditioned sequentially with 1 M NaHCO3 (10 ml), air (10 ml), water (10 ml), and air (10 ml)) and eluted with a QMA eluent solution (K2CO3 50 µg, KOTf 10 mg in 550 µl H2O), as published by Makaravage et al. To ensure consistency in the potassium [18F]fluoride and potassium triflate content introduced from the QMA eluent, the eluted radiofluoride was aliquoted (80 µl) into 6 × 5 ml Wheaton V-vials (oven dried), and each reactor (200–300 MBq) was separately azeotropically dried at 110 °C with acetonitrile (3 × 1 ml) under a stream of argon gas (as opposed to drying a single batch and aliquoting the poorly soluble [18F]KF thereafter). The reaction mixtures required by the DoE worksheet table were formulated from stock solutions of the required reagents in DMA (1 mg/10 µl) and diluted with DMA to the required reaction volume. Reactions run under argon were purged with a stream of argon gas for 20 seconds; reactions run under air were purged with air in similar fashion. The reactions were run at the required temperature for the specified time, after which they were quenched with 1 ml of water to solubilize the remaining fluoride. Samples of each reaction were taken for analysis.

Reaction analysis

RadioTLC was used to determine the relative incorporation of radiofluoride by the substrate and both product and by-product signals were quantified in order to determine %RCC. HPLC analysis was performed on representative samples over the course of the DoE studies to ensure compound identity.

Experimental design and analysis

All DoE studies were designed using the DoE software package Modde Go 12 (Umetrics). After the factors and responses of interest were defined, an appropriate design type was selected and a DoE experimental worksheet table was generated. All experiments were performed in randomized order. After the %RCC data were collected, the data were modelled using MLR and checked for outliers and model quality, after which the model could be used for factor screening or response surface optimization.
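Randomizing the execution order of the worksheet is a small but important detail: it prevents time-dependent drift (radionuclide decay, reagent aging, instrument warm-up) from being confounded with the factor settings. A minimal sketch, with hypothetical worksheet rows and a fixed seed for reproducibility:

```python
import random

# Hypothetical worksheet rows (factor settings are illustrative only)
worksheet = [
    {"run": 1, "substrate_umol": 5,  "cu_eq": 1.0, "pyridine_eq": 5},
    {"run": 2, "substrate_umol": 25, "cu_eq": 1.0, "pyridine_eq": 30},
    {"run": 3, "substrate_umol": 5,  "cu_eq": 4.0, "pyridine_eq": 30},
    {"run": 4, "substrate_umol": 25, "cu_eq": 4.0, "pyridine_eq": 5},
    {"run": 5, "substrate_umol": 15, "cu_eq": 2.5, "pyridine_eq": 17.5},
]

rng = random.Random(42)   # fixed seed so the randomized order is reproducible
order = list(worksheet)
rng.shuffle(order)        # execute runs in this shuffled order, not worksheet order
print([row["run"] for row in order])
```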

van der Born, D. et al . Fluorine-18 labelled building blocks for PET tracer synthesis. Chem. Soc. Rev. 46 , 4709–4773 (2017).

Brooks, A. F., Topczewski, J. J., Ichiishi, N., Sanford, M. S. & Scott, P. J. H. Late-stage [ 18 F]fluorination: New solutions to old problems. Chem. Sci. 5 , 4545–4553 (2014).

Campbell, M. G. et al . Bridging the gaps in 18 F PET tracer development. Nat. Chem. 9 , 1–3 (2016).

Miller, P. W., Long, N. J., Vilar, R. & Gee, A. D. Synthesis of 11 C, 18 F, 15 O, and 13 N Radiolabels for Positron Emission Tomography. Angew. Chemie Int. Ed. 47 , 8998–9033 (2008).

Campbell, M. G. & Ritter, T. Modern carbon-fluorine bond forming reactions for aryl fluoride synthesis. Chem. Rev. 115 , 612–633 (2015).

Beyzavi, M. H. et al . 18 F-Deoxyfluorination of Phenols via Ru π-Complexes. ACS Cent. Sci. 3 , 944–948 (2017).

Lee, E. et al . A Fluoride-Derived Electrophilic Late-Stage Fluorination Reagent for PET Imaging. Science 334 , 639–642 (2011).

Preshlock, S., Tredwell, M. & Gouverneur, V. 18 F-Labeling of Arenes and Heteroarenes for Applications in Positron Emission Tomography. Chem. Rev. 116 , 719–766 (2016).

Tredwell, M. et al . A General Copper-Mediated Nucleophilic 18 F Fluorination of Arenes. Angew. Chemie Int. Ed. 53 , 7751–7755 (2014).

Mossine, A. V. et al . Synthesis of [ 18 F]Arenes via the Copper-Mediated [ 18 F]Fluorination of Boronic Acids. Org. Lett. 17 , 5780–5783 (2015).

Makaravage, K. J., Brooks, A. F., Mossine, A. V., Sanford, M. S. & Scott, P. J. H. Copper-Mediated Radiofluorination of Arylstannanes with [ 18 F]KF. Org. Lett. 18 , 5440–5443 (2016).

Ichiishi, N. et al . Copper-Catalyzed [18 F]Fluorination of (Mesityl)(aryl)iodonium Salts. Org. Lett. 16 , 3224–3227 (2014).

McCammant, M. S. et al . Cu-Mediated C-H 18 F-Fluorination of Electron-Rich (Hetero)arenes. Org. Lett. 19 , 3939–3942 (2017).

Preshlock, S. et al . Enhanced copper-mediated 18 F-fluorination of aryl boronic esters provides eight radiotracers for PET applications. Chem. Commun. 52 , 8361–8364 (2016).

Zlatopolskiy, B. D. et al . Copper-mediated aromatic radiofluorination revisited: Efficient production of PET tracers on a preparative scale. Chem. - A Eur. J. 21 , 5972–5979 (2015).

Zarrad, F., Zlatopolskiy, B. D., Krapf, P., Zischler, J. & Neumaier, B. A practical method for the preparation of 18 F-labeled aromatic amino acids from nucleophilic [ 18 F]fluoride and stannyl precursors for electrophilic radiohalogenation. Molecules 22 (2017).

Antuganov, D. et al . Copper-Mediated Radiofluorination of Aryl Pinacolboronate Esters: A Straightforward Protocol by Using Pyridinium Sulfonates. European J. Org. Chem. 2019 , 918–922 (2019).

Richarz, R. et al . Neither azeotropic drying, nor base nor other additives: A minimalist approach to 18 F-labeling. Org. Biomol. Chem. 12 , 8094–8099 (2014).

Zischler, J., Kolks, N., Modemann, D., Neumaier, B. & Zlatopolskiy, B. D. Alcohol-Enhanced Cu-Mediated Radiofluorination. Chem. - A Eur. J. 23 , 3251–3256 (2017).

Mossine, A. V. et al . Development of Customized [(18)F]Fluoride Elution Techniques for the Enhancement of Copper-Mediated Late-Stage Radiofluorination. Sci. Rep. 7 , 233 (2017).

Murray, P. M. et al . The application of design of experiments (DoE) reaction optimisation and solvent selection in the development of new synthetic chemistry. Org. Biomol. Chem. 14 , 2373–2384 (2016).

Dejaegher, B. & Vander Heyden, Y. Experimental designs and their recent advances in set-up, data interpretation, and analytical applications. J. Pharm. Biomed. Anal. 56 , 141–158 (2011).

Lendrem, D. W. et al . Lost in space: Design of experiments and scientific exploration in a Hogarth Universe. Drug Discovery Today 20 , 1365–1371 (2015).

Murray, P. M., Tyler, S. N. G. & Moseley, J. D. Beyond the numbers: Charting chemical reaction space. Org. Process Res. Dev. 17 , 40–46 (2013).

Tye, H. Application of statistical ‘design of experiments’ methods in drug discovery. Drug Discov. Today 9 , 485–491 (2004).

Tye, H. & Whittaker, M. Use of a Design of Experiments approach for the optimisation of a microwave assisted Ugi reaction. Org. Biomol. Chem. 2 , 813–815 (2004).

Leardi, R. Experimental design in chemistry: A tutorial. Anal. Chim. Acta 652 , 161–172 (2009).

Mathiessen, B., Jensen, A. T. I. & Zhuravlev, F. Homogeneous nucleophilic radiofluorination and fluorination with phosphazene hydrofluorides. Chem. - A Eur. J. 17 , 7796–7805 (2011).

Mathiessen, B., Jensen, M. & Zhuravlev, F. [ 18 F] fluoride recovery via gaseous [ 18 F]HF. J. Label. Compd. Radiopharm. 54 , 816–818 (2011).

Qiao, J. X. & Lam, P. Y. S. Copper-promoted carbon-heteroatom bond cross-coupling with boronic acids and derivatives. Synthesis (Stuttg). 6 , 829–856 (2011).

King, A. E., Ryland, B. L., Brunold, T. C. & Stahl, S. S. Kinetic and spectroscopic studies of aerobic copper(II)-catalyzed methoxylation of arylboronic esters and insights into aryl transmetalation to copper(II). Organometallics 31 , 7948–7957 (2012).

Vaidyanathan, G. et al . Radiolabeled guanine derivatives for the in vivo mapping of O6-alkylguanine-DNA alkyltransferase: 6-(4-[ 18 F]fluoro-benzyloxy)-9H-purin-2-ylamine and 6-(3-[131I]iodo-benzyloxy)-9H-purin-2-ylamine. Bioconjug. Chem. 11 , 868–875 (2000).

Thonon, D., Kech, C., Paris, J., Lemaire, C. & Luxen, A. New Strategy for the Preparation of Clickable Peptides and Labeling with 1-(Azidomethyl)-4-[ 18 F]-fluorobenzene for PET. Bioconjug. Chem. 20 , 817–823 (2009).

Taylor, N. J. et al . Derisking the Cu-Mediated 18 F Fluorination of Heterocyclic Positron Emission Tomography Radioligands. J. Am. Chem. Soc. 139 , 8267–8276 (2017).

Murray, P. M. What is Experimental Design? catalysisconsulting.co.uk (2019). Available at, http://www.catalysisconsulting.co.uk/what-is-experimental-design.html . (Accessed: 3rd March 2019).

Acknowledgements

We would like to thank the Adolf Leuze Foundation and the Werner Siemens Foundation for their financial contributions towards this work. Funding for this work was also provided by the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) under Germany’s Excellence Strategy (EXC 2180–390900677). We would also like to thank Dr Gerald Reischl, Marko Matijevic and other colleagues in the radiopharmacy at the Werner Siemens Imaging Center for the delivery of [18F]fluoride and for technical assistance. Finally, we would like to thank Dr Paul Murray and Dr Laura Forfar from Catalyst Consulting for allowing us to use and adapt the figure from their website and for generously providing informative comments on our manuscript.

Author information

Authors and affiliations.

Werner Siemens Imaging Center, Department of Preclinical Imaging and Radiopharmacy, Eberhard Karls University, Tübingen, Germany

Gregory D. Bowden, Bernd J. Pichler & Andreas Maurer

iFIT-Cluster of Excellence, Eberhard Karls University, Tuebingen, Germany

Bernd J. Pichler & Andreas Maurer

Contributions

G.B. and A.M. conceived and designed experiments. G.B. performed the experiments. G.B., A.M., and B.J.P. analyzed data and wrote the paper.

Corresponding author

Correspondence to Andreas Maurer .

Ethics declarations

Competing interests.

The authors declare no competing interests.

Additional information

Publisher’s note: Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary information

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/ .

About this article

Cite this article.

Bowden, G.D., Pichler, B.J. & Maurer, A. A Design of Experiments (DoE) Approach Accelerates the Optimization of Copper-Mediated 18 F-Fluorination Reactions of Arylstannanes. Sci Rep 9 , 11370 (2019). https://doi.org/10.1038/s41598-019-47846-6

Received : 03 May 2019

Accepted : 23 July 2019

Published : 06 August 2019

DOI : https://doi.org/10.1038/s41598-019-47846-6







Guide to Experimental Design | Overview, 5 steps & Examples

Published on December 3, 2019 by Rebecca Bevans . Revised on June 21, 2023.

Experiments are used to study causal relationships . You manipulate one or more independent variables and measure their effect on one or more dependent variables.

Experimental design creates a set of procedures to systematically test a hypothesis. A good experimental design requires a strong understanding of the system you are studying.

There are five key steps in designing an experiment:

  • Consider your variables and how they are related
  • Write a specific, testable hypothesis
  • Design experimental treatments to manipulate your independent variable
  • Assign subjects to groups, either between-subjects or within-subjects
  • Plan how you will measure your dependent variable

For valid conclusions, you also need to select a representative sample and control any extraneous variables that might influence your results. Random assignment minimizes several types of research bias, particularly sampling bias, survivorship bias, and attrition bias. If random assignment of participants to control and treatment groups is impossible, unethical, or highly difficult, consider an observational study instead.

Table of contents

  • Step 1: Define your variables
  • Step 2: Write your hypothesis
  • Step 3: Design your experimental treatments
  • Step 4: Assign your subjects to treatment groups
  • Step 5: Measure your dependent variable
  • Other interesting articles
  • Frequently asked questions about experiments

You should begin with a specific research question . We will work with two research question examples, one from health sciences and one from ecology:

To translate your research question into an experimental hypothesis, you need to define the main variables and make predictions about how they are related.

Start by simply listing the independent and dependent variables .

Research question | Independent variable | Dependent variable
Phone use and sleep | Minutes of phone use before sleep | Hours of sleep per night
Temperature and soil respiration | Air temperature just above the soil surface | CO2 respired from soil

Then you need to think about possible extraneous and confounding variables and consider how you might control  them in your experiment.

Research question | Extraneous variable | How to control
Phone use and sleep | Natural variation in sleep patterns among individuals | Measure the average difference between sleep with phone use and sleep without phone use, rather than the average amount of sleep per treatment group
Temperature and soil respiration | Soil moisture also affects respiration, and moisture can decrease with increasing temperature | Monitor soil moisture and add water to make sure that soil moisture is consistent across all treatment plots

Finally, you can put these variables together into a diagram. Use arrows to show the possible relationships between variables and include signs to show the expected direction of the relationships.

Diagram of the relationship between variables in a sleep experiment

Here we predict that increasing temperature will increase soil respiration and decrease soil moisture, while decreasing soil moisture will lead to decreased soil respiration.


Now that you have a strong conceptual understanding of the system you are studying, you should be able to write a specific, testable hypothesis that addresses your research question.

Research question | Null hypothesis (H0) | Alternate hypothesis (Ha)
Phone use and sleep | Phone use before sleep does not correlate with the amount of sleep a person gets. | Increasing phone use before sleep leads to a decrease in sleep.
Temperature and soil respiration | Air temperature does not correlate with soil respiration. | Increased air temperature leads to increased soil respiration.

The next steps will describe how to design a controlled experiment . In a controlled experiment, you must be able to:

  • Systematically and precisely manipulate the independent variable(s).
  • Precisely measure the dependent variable(s).
  • Control any potential confounding variables.

If your study system doesn’t match these criteria, there are other types of research you can use to answer your research question.

How you manipulate the independent variable can affect the experiment’s external validity – that is, the extent to which the results can be generalized and applied to the broader world.

First, you may need to decide how widely to vary your independent variable. In the temperature experiment, for example, the warming treatments could be set:

  • just slightly above the natural range for your study region.
  • over a wider range of temperatures to mimic future warming.
  • over an extreme range that is beyond any possible natural variation.

Second, you may need to choose how finely to vary your independent variable. Sometimes this choice is made for you by your experimental system, but often you will need to decide, and this will affect how much you can infer from your results. In the phone use experiment, for example, phone use could be measured as:

  • a categorical variable: either as binary (yes/no) or as levels of a factor (no phone use, low phone use, high phone use).
  • a continuous variable (minutes of phone use measured every night).

How you apply your experimental treatments to your test subjects is crucial for obtaining valid and reliable results.

First, you need to consider the study size : how many individuals will be included in the experiment? In general, the more subjects you include, the greater your experiment’s statistical power , which determines how much confidence you can have in your results.
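The link between study size and statistical power can be made concrete with a standard planning formula. The sketch below uses the normal-approximation sample-size formula for comparing two group means, n = 2((z_alpha + z_beta)·sd / effect)²; the formula and the worked numbers are textbook material, not taken from this guide, and the function name is our own.

```python
from math import ceil
from statistics import NormalDist

def n_per_group(effect, sd, alpha=0.05, power=0.8):
    """Approximate per-group sample size for detecting a difference `effect`
    between two group means with standard deviation `sd`, using the
    two-sided normal approximation."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)   # critical value for the significance level
    z_beta = z.inv_cdf(power)            # quantile corresponding to desired power
    return ceil(2 * ((z_alpha + z_beta) * sd / effect) ** 2)

# e.g. to detect a 0.5 h difference in sleep with SD 1.0 h at 80% power
print(n_per_group(0.5, 1.0))   # 63 subjects per group
```

Halving the detectable effect quadruples the required sample size, which is why pinning down a realistic effect size before the experiment matters so much.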

Then you need to randomly assign your subjects to treatment groups . Each group receives a different level of the treatment (e.g. no phone use, low phone use, high phone use).

You should also include a control group , which receives no treatment. The control group tells us what would have happened to your test subjects without any experimental intervention.

When assigning your subjects to groups, there are two main choices you need to make:

  • A completely randomized design vs a randomized block design .
  • A between-subjects design vs a within-subjects design .

Randomization

An experiment can be completely randomized or randomized within blocks (aka strata):

  • In a completely randomized design , every subject is assigned to a treatment group at random.
  • In a randomized block design (aka stratified random design), subjects are first grouped according to a characteristic they share, and then randomly assigned to treatments within those groups.
Research question | Completely randomized design | Randomized block design
Phone use and sleep | Subjects are all randomly assigned a level of phone use using a random number generator. | Subjects are first grouped by age, and then phone use treatments are randomly assigned within these groups.
Temperature and soil respiration | Warming treatments are assigned to soil plots at random by using a number generator to generate map coordinates within the study area. | Soils are first grouped by average rainfall, and then treatment plots are randomly assigned within these groups.

Sometimes randomization isn’t practical or ethical , so researchers create partially-random or even non-random designs. An experimental design where treatments aren’t randomly assigned is called a quasi-experimental design .

Between-subjects vs. within-subjects

In a between-subjects design (also known as an independent measures design or classic ANOVA design), individuals receive only one of the possible levels of an experimental treatment.

In medical or social research, you might also use matched pairs within your between-subjects design to make sure that each treatment group contains the same variety of test subjects in the same proportions.

In a within-subjects design (also known as a repeated measures design), every individual receives each of the experimental treatments consecutively, and their responses to each treatment are measured.

Within-subjects or repeated measures can also refer to an experimental design where an effect emerges over time, and individual responses are measured over time in order to measure this effect as it emerges.

Counterbalancing (randomizing or reversing the order of treatments among subjects) is often used in within-subjects designs to ensure that the order of treatment application doesn’t influence the results of the experiment.

  • Phone use and sleep. Between-subjects design: subjects are randomly assigned a level of phone use (none, low, or high) and follow that level of phone use throughout the experiment. Within-subjects design: subjects are assigned consecutively to zero, low, and high levels of phone use, and the order in which they follow these treatments is randomized.
  • Temperature and soil respiration. Between-subjects design: warming treatments are assigned to soil plots at random, and the soils are kept at that temperature throughout the experiment. Within-subjects design: every plot receives each warming treatment (1, 3, 5, 8, and 10°C above ambient temperature) consecutively over the course of the experiment, and the order in which they receive these treatments is randomized.
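Counterbalancing in a within-subjects design can be sketched as drawing an independent random order of the treatments for each subject. This is a simplified illustration (real designs often use Latin squares to balance orders exactly):

```python
import random

treatments = ["none", "low", "high"]

def counterbalanced_orders(n_subjects, treatments, seed=0):
    """Give each subject every treatment, but in an independently randomized
    order, so that order effects average out across the sample."""
    rng = random.Random(seed)
    return [rng.sample(treatments, len(treatments)) for _ in range(n_subjects)]

orders = counterbalanced_orders(6, treatments)
```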

Finally, you need to decide how you’ll collect data on your dependent variable outcomes. You should aim for reliable and valid measurements that minimize research bias or error.

Some variables, like temperature, can be objectively measured with scientific instruments. Others may need to be operationalized to turn them into measurable observations. For example, to measure sleep in the phone-use experiment, you could:

  • Ask participants to record what time they go to sleep and get up each day.
  • Ask participants to wear a sleep tracker.

How precisely you measure your dependent variable also affects the kinds of statistical analysis you can use on your data.

Experiments are always context-dependent, and a good experimental design will take into account all of the unique considerations of your study system to produce information that is both valid and relevant to your research question.

If you want to know more about statistics, methodology, or research bias, make sure to check out some of our other articles with explanations and examples.

Experimental design means planning a set of procedures to investigate a relationship between variables. To design a controlled experiment, you need:

  • A testable hypothesis
  • At least one independent variable that can be precisely manipulated
  • At least one dependent variable that can be precisely measured

When designing the experiment, you decide:

  • How you will manipulate the variable(s)
  • How you will control for any potential confounding variables
  • How many subjects or samples will be included in the study
  • How subjects will be assigned to treatment levels

Experimental design is essential to the internal and external validity of your experiment.

The key difference between observational studies and experimental designs is that a well-designed observational study does not influence the responses of participants, whereas an experiment applies some sort of treatment condition to at least some participants, typically by random assignment.

A confounding variable, also called a confounder or confounding factor, is a third variable in a study examining a potential cause-and-effect relationship.

A confounding variable is related to both the supposed cause and the supposed effect of the study. It can be difficult to separate the true effect of the independent variable from the effect of the confounding variable.

In your research design, it's important to identify potential confounding variables and plan how you will reduce their impact.
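A small simulation makes the problem concrete (a sketch with invented effect sizes): regressing the outcome on the independent variable alone overstates its effect, while adjusting for the confounder recovers it.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000

# Invented data: the confounder C drives both the supposed cause X
# and the supposed effect Y; the true effect of X on Y is 0.5.
C = rng.normal(size=n)
X = 0.8 * C + rng.normal(size=n)
Y = 0.5 * X + 0.9 * C + rng.normal(size=n)

def slope_of_first(y, *predictors):
    """OLS coefficient of the first predictor (after the intercept)."""
    Z = np.column_stack([np.ones(len(y)), *predictors])
    return np.linalg.lstsq(Z, y, rcond=None)[0][1]

naive = slope_of_first(Y, X)        # ignores C: absorbs the confounder's effect
adjusted = slope_of_first(Y, X, C)  # controls for C: close to the true 0.5
```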

In a between-subjects design, every participant experiences only one condition, and researchers assess group differences between participants in various conditions.

In a within-subjects design, each participant experiences all conditions, and researchers test the same participants repeatedly for differences between conditions.

The word “between” means that you're comparing different conditions between groups, while the word “within” means you're comparing different conditions within the same group.

An experimental group, also known as a treatment group, receives the treatment whose effect researchers wish to study, whereas a control group does not. They should be identical in all other ways.


Bevans, R. (2023, June 21). Guide to Experimental Design | Overview, 5 steps & Examples. Scribbr. Retrieved September 3, 2024, from https://www.scribbr.com/methodology/experimental-design/


Design of Experiments: A Modern Approach, 1st Edition

ISBN: 978-1-119-61119-6

December 2019

Bradley Jones , Douglas C. Montgomery

Design of Experiments: A Modern Approach introduces readers to planning and conducting experiments, analyzing the resulting data, and obtaining valid and objective conclusions. This innovative textbook uses design optimization as its design construction approach, focusing on practical experiments in engineering, science, and business rather than orthogonal designs and extensive analysis. Requiring only first-course knowledge of statistics and familiarity with matrix algebra, student-friendly chapters cover the design process for a range of experiment types.

The text follows a traditional outline for a design of experiments course, beginning with an introduction to the topic, historical notes, a review of fundamental statistics concepts, and a systematic process for designing and conducting experiments. Subsequent chapters cover simple comparative experiments, analysis of variance, two-factor factorial experiments, randomized complete block design, response surface methodology, designs for nonlinear models, and more. Readers gain a solid understanding of the role of experimentation in technology commercialization and product realization activities—including new product design, manufacturing process development, and process improvement—as well as many applications of designed experiments in other areas such as marketing, service operations, e-commerce, and general business operations.

  • Uses flexible, practically applicable design optimization as a design construction approach to address the unique features of a design problem
  • Reviews basic statistical and experiment design concepts and methods
  • Covers the four basic principles of experimental design: the factorial principle, randomization, replication, and blocking
  • Presents definitive screening designs as a three-level alternative to standard screening designs
  • Relies on software for calculation and analysis of important design material
  • Includes numerous charts, graphs, tables, illustrations, and end-of-chapter problems


What is DOE? Design of Experiments Basics for Beginners

[This blog was a favorite last year, so we thought you'd like to see it again. Send us your comments!] Whether you work in engineering, R&D, or a science lab, understanding the basics of experimental design can help you achieve more statistically optimal results from your experiments or improve your output quality.

This article is posted on our Science Snippets Blog .

Using  Design of Experiments (DOE)  techniques, you can determine the individual and interactive effects of various factors that can influence the output results of your measurements. You can also use DOE to gain knowledge and estimate the best operating conditions of a system, process or product.

DOE applies to many different investigation objectives, but can be especially important early on in a screening investigation to help you determine what the most important factors are. It can then help you optimize and better understand how the most important factors you can regulate influence the responses or critical quality attributes.

Another important application area for DOE is in making production more effective by identifying factors that can reduce material and energy consumption or minimize costs and waiting time. It is also valuable for robustness testing to ensure quality before releasing a product or system to the market.

What’s the Alternative?

In order to understand why Design of Experiments is so valuable, it may be helpful to take a look at what DOE helps you achieve. A good way to illustrate this is by looking at an alternative approach, one that we call the “COST” approach. The COST (Change One Separate factor at a Time) approach might be considered an intuitive or even logical way to approach your experimentation options (until, that is, you have been exposed to the ideas and thinking of DOE).

Let’s consider the example of a small chemical reaction where the goal is to find optimal conditions for yield. In this example, we can vary only two elements, or factors:

  • the volume of the reaction container (between 500 and 700 ml), and
  • the pH of the solution (between 2.5 and 5).

We change the experimental factors and measure the response outcome, which, in this case, is the yield of the desired product. Using the COST approach, we can vary just one of the factors at a time to see what effect it has on the yield.

So, for example, first we might fix the pH at 3, and change the volume of the reaction container from a low setting of 500 ml to a high of 700 ml. From that we can measure the yield.

Below is an example of a table that shows the yield that was obtained when changing the volume from 500 to 700 ml. In the scatterplot on the right, we have plotted the measured yield against the change in reaction volume, and it doesn’t take long to see that the best volume is located at 550 ml.

Next, we evaluate what will happen when we fix the volume at 550 ml (the optimal level) and start to change the second factor. In this second experimental series, the pH is changed from 2.5 to 5.0 and you can see the measured yields. These are listed in the table and plotted below. From this we can see that the optimal pH is around 4.5.

The optimal combination for the best yield would be a volume of 550 ml and pH 4.5. Sounds good right? But, let’s consider this a bit more.
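To make the pitfall concrete, here is a small Python sketch of the COST procedure on a hypothetical yield surface (the function and its coefficients are invented; the blog's actual measurements are not reproduced here). Because the surface contains a volume–pH interaction, the one-factor-at-a-time search stops short of the optimum over the same region:

```python
import numpy as np

# Invented yield surface (percent) with a volume-pH interaction,
# purely for illustration of the COST procedure.
def yield_pct(volume, ph):
    return (60 - 0.001 * (volume - 640) ** 2 - 8 * (ph - 4.2) ** 2
            + 0.05 * (volume - 600) * (ph - 3.5))

volumes = np.arange(500, 701, 25)   # 500-700 ml
phs = np.arange(2.5, 5.01, 0.25)    # pH 2.5-5.0

# COST step 1: fix pH at 3 and scan the volume.
best_v = volumes[np.argmax([yield_pct(v, 3.0) for v in volumes])]
# COST step 2: fix that volume and scan the pH.
best_ph = phs[np.argmax([yield_pct(best_v, p) for p in phs])]
cost_yield = yield_pct(best_v, best_ph)

# The "map" view: evaluate the whole grid, as a designed experiment would.
true_yield, true_v, true_ph = max((yield_pct(v, p), v, p)
                                  for v in volumes for p in phs)
# Because of the interaction, cost_yield falls below true_yield.
```

The two one-dimensional scans each look convincing on their own; only the full map reveals that the combined optimum lies elsewhere.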

Gaining a Better Perspective With DOE

What happens when we take more of a bird’s eye perspective, and look at the overall experimental map by number and order of experiments?

For example, in the first experimental series (indicated on the horizontal axis below), we moved the experimental settings from left to right, and we found out that 550 was the optimal volume.

Then in the second experimental series, we moved from bottom to top (as shown in the scatterplot below) and after a while we found out that the best yield was at experiment number 10 (4.5 pH).

The problem here is that we cannot be certain that experimental point number 10 is truly the best one. The risk is that we have perceived it as the optimum without that really being the case. We may also question the number of experiments used: have we used the right number of runs?

Zooming out and picturing what we have done on a map, we can see that we have only been exploiting a very small part of the entire experimental space. The true relationship between pH and volume is represented by the Contour Plot pictured below. We can see that the optimal value would be somewhere at the top in the larger red area.

So the problem with the COST approach is that a different starting point can lead to very different conclusions. We perceive that the optimum was found, and, perhaps more problematically, we never realize that continuing to do additional experiments would have produced even higher yields.

How to Design Better Experiments

Instead, using the DOE approach, we can build a map in a much better way. First, consider the use of just two factors, which limits the range of experiments. As the contour plot below shows, we would have at least four experiments (defining the corners of a rectangle).

These four points can be optimally supplemented by a couple of points representing the variation in the interior part of the experimental design.

The important thing here is that when we start to evaluate the result, we will obtain very valuable information about the direction in which to move for improving the result. We will understand that we should reposition the experimental plan according to the dashed arrow.

However, DOE is NOT limited to looking at just two factors. It can be applied to three, four or many more factors.

If we take the approach of using three factors, the experimental protocol will start to define a cube rather than a rectangle. So the factorial points will be the corners of the cube.
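A sketch of how such a two-level factorial protocol can be generated, using the reaction example's factor ranges plus a hypothetical third factor:

```python
from itertools import product

# Factor ranges: the reaction example's two factors plus an invented third.
factors = {
    "volume_ml": (500, 700),
    "pH": (2.5, 5.0),
    "temperature_C": (20, 40),  # hypothetical third factor for illustration
}

# A two-level full factorial takes every combination of low/high levels:
# 2 factors give the 4 corners of a rectangle, 3 factors the 8 corners of a cube.
runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]

# Interior (center) point that supplements the corner runs.
center = {name: (lo + hi) / 2 for name, (lo, hi) in factors.items()}
```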

In this way, DOE allows you to construct a carefully prepared set of representative experiments, in which all relevant factors are varied simultaneously.

DOE is about creating a set of experiments that work together to map an interesting experimental region. So with DOE we can prepare a set of experiments that are optimally placed to bring back as much information as possible about how the factors influence the responses.

Plus, we will have support for different types of regression models. For example, we can estimate what we call a linear model, an interaction model, or a quadratic model. So the selected experimental plan will support a specific type of model.
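The idea that the plan supports a specific model type can be illustrated with ordinary least squares. The coded design below is a face-centered arrangement (corners, face centers, and a center point); the response values are invented for illustration:

```python
import numpy as np

# Coded factor levels: four corner runs, four face-center runs, one center run.
x1 = np.array([-1.0, 1.0, -1.0, 1.0, -1.0, 1.0, 0.0, 0.0, 0.0])
x2 = np.array([-1.0, -1.0, 1.0, 1.0, 0.0, 0.0, -1.0, 1.0, 0.0])
y = np.array([52.0, 61.0, 58.0, 74.0, 55.0, 67.0, 56.0, 69.0, 65.0])  # invented

ones = np.ones_like(x1)
designs = {
    "linear": np.column_stack([ones, x1, x2]),
    "interaction": np.column_stack([ones, x1, x2, x1 * x2]),
    "quadratic": np.column_stack([ones, x1, x2, x1 * x2, x1 ** 2, x2 ** 2]),
}

# Ordinary least squares fit and residual sum of squares for each model type.
coefs = {name: np.linalg.lstsq(X, y, rcond=None)[0] for name, X in designs.items()}
sse = {name: float(np.sum((X @ coefs[name] - y) ** 2))
       for name, X in designs.items()}
```

Note that the plain four-corner design alone could not separate the pure quadratic terms; the added face-center and center points are what make the quadratic model estimable.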

Why Is DOE a Better Approach?

We can see three main reasons that DOE is a better approach to experiment design than the COST approach:

  • DOE suggests the correct number of runs needed (often fewer than used by the COST approach)
  • DOE provides a model for the direction to follow
  • Many factors can be used (not just two)

In summary, the benefits of DOE are:

  • An organized approach that connects experiments in a rational manner
  • The influence of and interactions between all factors can be estimated
  • More precise information is acquired in fewer experiments
  • Results are evaluated in the light of variability
  • Support for decision-making: a map of the system (response contour plot)


Going beyond the comparison: toward experimental instructional design research with impact

  • Methodology
  • Published: 28 August 2024


  • Adam G. Gavarkovs 1 ,
  • Rashmi A. Kusurkar 2 , 3 , 4 ,
  • Kulamakan Kulasegaram 5 , 6 &
  • Ryan Brydges 6 , 7  


To design effective instruction, educators need to know what design strategies are generally effective and why these strategies work, based on the mechanisms through which they operate. Experimental comparison studies, which compare one instructional design against another, can generate much needed evidence in support of effective design strategies. However, experimental comparison studies are often not equipped to generate evidence regarding the mechanisms through which strategies operate. Therefore, simply conducting experimental comparison studies may not provide educators with all the information they need to design more effective instruction. To generate evidence for the what and the why of design strategies, we advocate for researchers to conduct experimental comparison studies that include mediation or moderation analyses, which can illuminate the mechanisms through which design strategies operate. The purpose of this article is to provide a conceptual overview of mediation and moderation analyses for researchers who conduct experimental comparison studies in instructional design. While these statistical techniques add complexity to study design and analysis, they hold great promise for providing educators with more powerful information upon which to base their instructional design decisions. Using two real-world examples from our own work, we describe the structure of mediation and moderation analyses, emphasizing the need to control for confounding even in the context of experimental studies. We also discuss the importance of using learning theories to help identify mediating or moderating variables to test.


Data availability

No datasets were generated or analysed during the current study.

As an alternative to the regression approach, structural equation modelling (SEM) has gained popularity in the health professions education literature (Stoffels et al., 2023 ). SEM requires that a researcher make additional assumptions regarding the functional relationships between the covariates, the mediator(s), and the outcome(s) (VanderWeele, 2012 ). Though specifying these relationships can increase power, it comes with an increased risk of model misspecification (VanderWeele, 2012 ). Accordingly, we recommend that researchers beginning with experimental comparison studies involving a single mediator opt for using the regression-based approach with controls for mediator-outcome confounding (VanderWeele, 2012 ).

We did not actually analyze our data in the manner described below, for reasons described in our published manuscript. Here, we describe an alternative data analysis strategy for clarity.
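As a rough illustration of the regression-based approach with a single mediator, the product-of-coefficients estimate can be computed from a pair of regressions. The data and effect sizes below are simulated and invented; this is a sketch of the general technique, not the authors' actual analysis:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Simulated trial: randomized treatment X, mediator M, outcome Y, and a
# baseline covariate C that confounds the mediator-outcome relationship.
X = rng.integers(0, 2, n).astype(float)       # random assignment
C = rng.normal(size=n)                        # mediator-outcome confounder
M = 0.5 * X + 0.4 * C + rng.normal(size=n)    # mediator model
Y = 0.3 * X + 0.6 * M + 0.5 * C + rng.normal(size=n)

def ols(y, *predictors):
    """Least-squares coefficients for y on an intercept plus predictors."""
    Z = np.column_stack([np.ones(len(y)), *predictors])
    return np.linalg.lstsq(Z, y, rcond=None)[0]

a = ols(M, X, C)[1]          # treatment -> mediator path, adjusting for C
b = ols(Y, M, X, C)[1]       # mediator -> outcome path, adjusting for X and C
direct = ols(Y, X, M, C)[1]  # direct treatment effect, adjusting for M and C
indirect = a * b             # mediated (indirect) effect
```

Controlling for C in the outcome model reflects the point made above: even with a randomized treatment, mediator–outcome confounding must be addressed.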

Baron, R. M., & Kenny, D. A. (1986). The moderator–mediator variable distinction in social psychological research: Conceptual, strategic, and statistical considerations. Journal of Personality and Social Psychology, 51 (6), 1173–1182. https://doi.org/10.1037/0022-3514.51.6.1173

Bürkner, P. C. (2017). brms: An R package for Bayesian multilevel models using Stan. Journal of Statistical Software. https://doi.org/10.18637/jss.v080.i01

Carver, C. S., & Scheier, M. F. (1998). On the Self-Regulation of Behavior (1st ed.). Cambridge University Press. https://doi.org/10.1017/CBO9781139174794

Cheung, J. J. H., & Kulasegaram, K. M. (2022). Beyond the tensions within transfer theories: Implications for adaptive expertise in the health professions. Advances in Health Sciences Education, 27 (5), 1293–1315. https://doi.org/10.1007/s10459-022-10174-y

Cheung, J. J. H., Kulasegaram, K. M., Woods, N. N., & Brydges, R. (2019). Why Content and cognition matter: Integrating conceptual knowledge to support simulation-based procedural skills transfer. Journal of General Internal Medicine, 34 (6), 969–977. https://doi.org/10.1007/s11606-019-04959-y

Cheung, J. J. H., Kulasegaram, K. M., Woods, N. N., & Brydges, R. (2021). Making concepts material: A randomized trial exploring simulation as a medium to enhance cognitive integration and transfer of learning. Simulation in Healthcare: THe Journal of the Society for Simulation in Healthcare, 16 (6), 392–400. https://doi.org/10.1097/SIH.0000000000000543

Cheung, J. J. H., Kulasegaram, K. M., Woods, N. N., Moulton, C., Ringsted, C. V., & Brydges, R. (2018). Knowing How and Knowing Why: Testing the effect of instruction designed for cognitive integration on procedural skills transfer. Advances in Health Sciences Education, 23 (1), 61–74. https://doi.org/10.1007/s10459-017-9774-1

Cook, D. A. (2005). The research we still are not doing: An agenda for the study of computer-based learning. Academic Medicine, 80 (6), 541–548. https://doi.org/10.1097/00001888-200506000-00005

Cook, D. A. (2009). The failure of e-learning research to inform educational practice, and what we can do about it. Medical Teacher, 31 (2), 158–162. https://doi.org/10.1080/01421590802691393

Durik, A. M., Shechter, O. G., Noh, M., Rozek, C. S., & Harackiewicz, J. M. (2015). What if I can’t? Success expectancies moderate the effects of utility value information on situational interest and performance. Motivation and Emotion, 39 (1), 104–118. https://doi.org/10.1007/s11031-014-9419-0

Ertmer, P. A., & Stepich, D. A. (2005). Instructional design expertise: How will we know it when we see it? Educational Technology, 45 (6), 38–43.

Fiorella, L., & Mayer, R. E. (2016). Eight ways to promote generative learning. Educational Psychology Review, 28 (4), 717–741. https://doi.org/10.1007/s10648-015-9348-9

Friedman, C. P. (1994). The research we should be doing. Academic Medicine, 69 (6), 455–457. https://doi.org/10.1097/00001888-199406000-00005

Gavarkovs, A. G., Crukley, J., Miller, E., Kusurkar, R. A., Kulasegaram, K., & Brydges, R. (2023a). Effectiveness of life goal framing to motivate medical students during online learning: A randomized controlled trial. Perspectives on Medical Education, 12 (1), 444–454. https://doi.org/10.5334/pme.1017

Gavarkovs, A. G., Finan, E., Jensen, R. D., & Brydges, R. (2024). When I say … active learning. Medical Education . https://doi.org/10.1111/medu.15383

Gavarkovs, A. G., Kusurkar, R. A., & Brydges, R. (2023b). The purpose, adaptability, confidence, and engrossment model: A novel approach for supporting professional trainees’ motivation, engagement, and academic achievement. Frontiers in Education, 8 , 1036539. https://doi.org/10.3389/feduc.2023.1036539

Hardré, P. L., Ge, X., & Thomas, M. K. (2005). Toward a model of development for instructional design expertise. Educational Technology, 45 (1), 53–57.

Hatano, G., & Inagaki, K. (1986). Two courses of expertise. In Child Development and Education in Japan (pp. 262–272). W. H. Freeman.

Hayes, A. F. (2022). Introduction to mediation, moderation, and conditional process analysis: A regression-based approach (3rd ed.). The Guilford Press.

Kalyuga, S. (2007). Expertise reversal effect and its implications for learner-tailored instruction. Educational Psychology Review, 19 (4), 509–539. https://doi.org/10.1007/s10648-007-9054-3

Kusurkar, R. A. (2023). Self-determination theory in health professions education research and practice. In R. M. Ryan (Ed.), The oxford handbook of self-determination theory (pp. 665–683). Oxford University Press. https://doi.org/10.1093/oxfordhb/9780197600047.013.33

Kusurkar, R. A., Croiset, G., & Ten Cate, OTh. J. (2011). Twelve tips to stimulate intrinsic motivation in students through autonomy-supportive classroom teaching derived from Self-Determination Theory. Medical Teacher, 33 (12), 978–982. https://doi.org/10.3109/0142159X.2011.599896

Laidley, T. L., & Braddock, C. H. (2000). Role of adult learning theory in evaluating and designing strategies for teaching residents in ambulatory settings. Advances in Health Sciences Education, 5 (1), 43–54. https://doi.org/10.1023/A:1009863211233

Lawson, A. P., & Mayer, R. E. (2021). Benefits of writing an explanation during pauses in multimedia lessons. Educational Psychology Review, 33 (4), 1859–1885. https://doi.org/10.1007/s10648-021-09594-w

Maheu-Cadotte, M.-A., Cossette, S., Dubé, V., Fontaine, G., Lavallée, A., Lavoie, P., Mailhot, T., & Deschênes, M.-F. (2021). Efficacy of serious games in healthcare professions education: A systematic review and meta-analysis. Simulation in Healthcare: THe Journal of the Society for Simulation in Healthcare, 16 (3), 199–212. https://doi.org/10.1097/SIH.0000000000000512

Mann, K. V. (2004). The role of educational theory in continuing medical education: Has it helped us? Journal of Continuing Education in the Health Professions, 24 (Supplement 1), S22–S30. https://doi.org/10.1002/chp.1340240505

Mayer, R. E. (2023). How to assess whether an instructional intervention has an effect on learning. Educational Psychology Review, 35 (2), 64. https://doi.org/10.1007/s10648-023-09783-9

Schoemann, A. M., Boulton, A. J., & Short, S. D. (2017). Determining power and sample size for simple and complex mediation models. Social Psychological and Personality Science, 8 (4), 379–386. https://doi.org/10.1177/1948550617715068

Shadish, W. R., Cook, T. D., & Campbell, D. T. (2001). Experimental and quasi-experimental designs for generalized causal inference . Houghton Mifflin.

Spencer, S. J., Zanna, M. P., & Fong, G. T. (2005). Establishing a causal chain: Why experiments are often more effective than mediational analyses in examining psychological processes. Journal of Personality and Social Psychology, 89 (6), 845–851. https://doi.org/10.1037/0022-3514.89.6.845

Stoffels, M., Torre, D. M., Sturgis, P., Koster, A. S., Westein, M. P. D., & Kusurkar, R. A. (2023). Steps and decisions involved when conducting structural equation modeling (SEM) analysis. Medical Teacher . https://doi.org/10.1080/0142159X.2023.2263233

Tai, A.-S., Lin, S.-H., Chu, Y.-C., Yu, T., Puhan, M. A., & VanderWeele, T. (2023). Causal mediation analysis with multiple time-varying mediators. Epidemiology, 34 (1), 8–19. https://doi.org/10.1097/EDE.0000000000001555

VanderWeele, T. J. (2012). Invited commentary: Structural equation models and epidemiologic analysis. American Journal of Epidemiology, 176 (7), 608–612. https://doi.org/10.1093/aje/kws213

VanderWeele, T. J. (2015). Explanation in causal inference: Methods for mediation and interaction . Oxford University Press.

VanderWeele, T. J. (2016). Mediation analysis: A practitioner’s guide. Annual Review of Public Health, 37 (1), 17–32. https://doi.org/10.1146/annurev-publhealth-032315-021402

VanderWeele, T. J., & Knol, M. J. (2014). A tutorial on interaction. Epidemiologic Methods . https://doi.org/10.1515/em-2013-0005

Woods, N. N., Brooks, L. R., & Norman, G. R. (2007). It all make sense: Biomedical knowledge, causal connections and memory in the novice diagnostician. Advances in Health Sciences Education, 12 (4), 405–415. https://doi.org/10.1007/s10459-006-9055-x

Author information

Authors and Affiliations

Faculty of Medicine, University of British Columbia, City Square East Tower, 555 W 12th Ave, Suite 200, Vancouver, BC, V5Z 3X7, Canada

Adam G. Gavarkovs

Research in Education, Amsterdam UMC Location Vrije Universiteit Amsterdam, De Boelelaan 1118, Amsterdam, The Netherlands

Rashmi A. Kusurkar

LEARN! Research Institute for Learning and Education, Faculty of Psychology and Education, VU University Amsterdam, Amsterdam, The Netherlands

Amsterdam Public Health, Quality of Care, Amsterdam, The Netherlands

Department of Family and Community Medicine, Temerty Faculty of Medicine, University of Toronto, Toronto, ON, Canada

Kulamakan Kulasegaram

The Wilson Centre, University of Toronto/University Health Network, Toronto, ON, Canada

Kulamakan Kulasegaram & Ryan Brydges

Department of Medicine, Temerty Faculty of Medicine, University of Toronto, Toronto, ON, Canada

Ryan Brydges

Contributions

A.G. conceptualized the topic of the manuscript and wrote the first draft. R.K., K.K., and R.B. provided contributions to subsequent drafts of the manuscript. All authors reviewed the final version of the manuscript.

Corresponding author

Correspondence to Adam G. Gavarkovs .

Ethics declarations

Conflict of interest.

The authors declare no competing interests.

Additional information

Publisher's note.

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

About this article

Gavarkovs, A.G., Kusurkar, R.A., Kulasegaram, K. et al. Going beyond the comparison: toward experimental instructional design research with impact. Adv in Health Sci Educ (2024). https://doi.org/10.1007/s10459-024-10365-9

Received : 06 March 2024

Accepted : 05 August 2024

Published : 28 August 2024

DOI : https://doi.org/10.1007/s10459-024-10365-9


  • Randomized controlled trial
  • Quantitative data analysis
  • Learning theory

Physical Review A

Covering atomic, molecular, and optical physics and quantum science.

Tomography-assisted noisy quantum circuit simulator using matrix product density operators

Wei-Guo Ma, Yun-Hao Shi, Kai Xu, and Heng Fan, Phys. Rev. A 110, 032604 – Published 4 September 2024


In recent years, efficient quantum circuit simulations incorporating ideal noise assumptions have relied on tensor network simulators, particularly leveraging the matrix product density operator (MPDO) framework. However, experiments on real noisy intermediate-scale quantum (NISQ) devices often involve complex noise profiles, encompassing uncontrollable elements and instrument-specific effects such as crosstalk. To address these challenges, we employ quantum process tomography (QPT) techniques to directly capture the operational characteristics of the experimental setup and integrate them into numerical simulations using MPDOs. Our QPT-assisted MPDO simulator is then applied to explore a variational approach for generating noisy entangled states, comparing the results with standard noise numerical simulations and demonstrations conducted on the Quafu cloud quantum computation platform. Additionally, we investigate noisy MaxCut problems, as well as the effects of crosstalk and noise truncation. Our results provide valuable insights into the impact of noise on NISQ devices and lay the foundation for enhanced design and assessment of quantum algorithms in complex noise environments.


  • Received 12 May 2024
  • Accepted 20 August 2024

DOI: https://doi.org/10.1103/PhysRevA.110.032604

©2024 American Physical Society


Authors & Affiliations

  • 1 Institute of Physics, Chinese Academy of Sciences, Beijing 100190, China
  • 2 School of Physical Sciences, University of Chinese Academy of Sciences, Beijing 100049, China
  • 3 Beijing Academy of Quantum Information Sciences, Beijing 100193, China
  • 4 CAS Center for Excellence in Topological Quantum Computation, UCAS, Beijing 100049, China
  • * Contact author: [email protected]
  • † Contact author: [email protected]


Vol. 110, Iss. 3 — September 2024



Quantum noise model. (a) The general quantum noise arrangement in an actual circuit involves noise channels with time evolution and quantum operations. Errors can occur during the state preparation and measurement stages (whose channel is called SPAM), propagate during single-qubit gates, and accumulate over time through processes like depolarizing channel (DC) and thermal relaxation channel (TRC). The effect of noise in two-qubit gates varies across different models, with efficient methods devised to simulate and address these noise sources. (b) The unified noisy circuit model. In this model, the depolarizing channel is exclusively applied to the “target” qubit, sparing the “control” qubit from this particular error source. (c) General ideal noisy circuit model. In contrast to the unified model, this ideal model imposes both depolarizing and thermal relaxation channels on the qubits where two-qubit gates are applied, resulting in more complex noise interactions.
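
The thermal relaxation channel (TRC) named in the caption can be sketched, in its simplest T1-only form, as amplitude damping with gamma = 1 − exp(−t/T1). The gate time t and relaxation time T1 below are invented illustrative values, not parameters from the paper or any device.

```python
import numpy as np

# Amplitude-damping Kraus operators: a T1-decay-only stand-in for
# the thermal relaxation channel.
def amplitude_damping_kraus(gamma):
    K0 = np.array([[1, 0], [0, np.sqrt(1 - gamma)]], dtype=complex)
    K1 = np.array([[0, np.sqrt(gamma)], [0, 0]], dtype=complex)
    return [K0, K1]

def apply_channel(rho, kraus):
    return sum(K @ rho @ K.conj().T for K in kraus)

t, T1 = 100e-9, 50e-6                    # illustrative gate time / T1
gamma = 1 - np.exp(-t / T1)
rho1 = np.diag([0, 1]).astype(complex)   # excited state |1><1|
out = apply_channel(rho1, amplitude_damping_kraus(gamma))
print(out[1, 1].real)  # excited population decays to exp(-t/T1)
```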

(a) MPDO representation of the density matrix for an open quantum system, where qubits are linked to their conjugate spaces by a purple leg, introduced by quantum noise. Each qubit is modeled by a blue tensor denoted as T[b], while its corresponding conjugate is represented by T[p]. Index i labels the ith qubit. Through the contraction process, T[b] and T[p] yield the density matrix M. (b) Transforming the Kraus representation of a quantum noise channel into a higher-order tensor representation. (c) Implementing the higher-order tensor representation of the noise channel on a single-qubit quantum gate. Upon contraction, it gives rise to a noisy single-qubit gate, which is then connected to the respective qubit.
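
The step in panels (b)–(c) can be sketched as follows: stack the Kraus operators into a rank-3 tensor N[k, i, j] and contract it with a single-qubit gate G, producing a "noisy gate" tensor whose extra index k is the noise leg. The phase-flip noise and the Hadamard gate are arbitrary stand-ins for illustration.

```python
import numpy as np

p = 0.1
I = np.eye(2, dtype=complex)
Z = np.diag([1, -1]).astype(complex)
# Stack Kraus operators of a phase-flip channel into N[k, i, j].
kraus = np.stack([np.sqrt(1 - p) * I, np.sqrt(p) * Z])

H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)  # the gate
noisy_gate = np.einsum('kij,jl->kil', kraus, H)              # K_k @ H

# Acting on rho: sum_k (K_k H) rho (K_k H)^dagger — trace preserving.
rho = np.array([[1, 0], [0, 0]], dtype=complex)
out = np.einsum('kij,jl,kml->im', noisy_gate, rho, noisy_gate.conj())
print(np.trace(out).real)  # 1.0
```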

(a) SVD truncation on the dimension of the noise inner indices with limitation κ . (b) SVD truncation on the dimension of the bond indices with the limitation χ .
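
The truncation step can be sketched on a plain matrix: keep only the chi largest singular values, and the discarded singular values give exactly the truncation error (Eckart–Young). The random 8×8 matrix and chi = 4 below are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((8, 8))

def truncate(M, chi):
    # Keep the chi leading singular triples of M.
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U[:, :chi] @ np.diag(s[:chi]) @ Vt[:chi, :]

M4 = truncate(M, 4)
err = np.linalg.norm(M - M4) / np.linalg.norm(M)  # relative error
s = np.linalg.svd(M, compute_uv=False)
# Error equals the norm of the discarded singular values.
print(np.isclose(err, np.linalg.norm(s[4:]) / np.linalg.norm(s)))  # True
```

The same compression applies to both the bond indices (limit chi) and the noise inner indices (limit kappa) of the MPDO.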

Quantum circuit contraction for MPDOs. Purple lines represent classical noise, and red lines depict quantum entanglement between qubits. The application of noisy gates introduces purple noise lines, and two-qubit gates establish red entanglement lines between the qubits.

Experimental quantum circuit. (a) Illustration depicting a quantum gate operation, where unaccounted environmental noises are represented as a dark, nebulous cloud surrounding the gate. This emphasizes the ubiquitous presence of noise during gate execution, affecting the fidelity of the quantum operation. (b) The application of quantum process tomography to a noisy two-qubit gate, specifically highlighting the phenomenon of crosstalk. The crosstalk effect, depicted as an interaction with the nearest qubit, is crucial for understanding and characterizing the intricate noise dynamics in multiqubit systems. (c) Utilizing singular value decomposition to extract the dominant noise tensor from the gate tensor.

Illustration of the conversion of the χ matrix to Kraus operators within a tensor network framework. The process begins with transforming the χ matrix into the Choi matrix via a basis transformation. Following this, an eigenvalue decomposition is performed on the Choi matrix, resulting in the derivation of Kraus operators. The different colors represent various summation operations involved in this transformation process.
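
The Choi-to-Kraus step of this pipeline can be checked numerically (the initial chi-to-Choi basis transformation is skipped here): eigendecompose the Choi matrix and reshape each scaled eigenvector into a Kraus operator. The phase-flip channel below is an illustrative stand-in for the paper's measured process matrices.

```python
import numpy as np

p = 0.25
I = np.eye(2, dtype=complex)
Z = np.diag([1, -1]).astype(complex)
kraus_true = [np.sqrt(1 - p) * I, np.sqrt(p) * Z]

def channel(rho):
    return sum(K @ rho @ K.conj().T for K in kraus_true)

# Choi matrix with convention C[(a,i),(b,j)] = E(|i><j|)[a,b].
C = np.zeros((4, 4), dtype=complex)
for i in range(2):
    for j in range(2):
        E = np.zeros((2, 2), dtype=complex); E[i, j] = 1
        out = channel(E)
        for a in range(2):
            for b in range(2):
                C[2*a + i, 2*b + j] = out[a, b]

# Eigendecomposition -> Kraus operators (reshape scaled eigenvectors).
w, V = np.linalg.eigh(C)
kraus_rec = [np.sqrt(max(lam.real, 0)) * V[:, m].reshape(2, 2)
             for m, lam in enumerate(w) if lam > 1e-12]

rho = np.array([[0.5, 0.5], [0.5, 0.5]], dtype=complex)
rec = sum(K @ rho @ K.conj().T for K in kraus_rec)
print(np.allclose(rec, channel(rho)))  # True: channel is reproduced
```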

Demonstration of the compilation of a CNOT gate into a CZ gate, flanked by two RY gates.
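
The identity behind this compilation can be verified numerically; the sketch below assumes the standard decomposition CNOT = (I ⊗ RY(π/2)) · CZ · (I ⊗ RY(−π/2)), with the RY rotations acting on the target qubit.

```python
import numpy as np

def RY(theta):
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

I = np.eye(2)
CZ = np.diag([1.0, 1.0, 1.0, -1.0])
CNOT = np.array([[1, 0, 0, 0], [0, 1, 0, 0],
                 [0, 0, 0, 1], [0, 0, 1, 0]], dtype=float)

# CZ conjugated by RY(+/- pi/2) on the target reproduces CNOT exactly.
compiled = np.kron(I, RY(np.pi / 2)) @ CZ @ np.kron(I, RY(-np.pi / 2))
print(np.allclose(compiled, CNOT))  # True
```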

Schematic of five-qubit QAOA circuit with nearest-neighbor interaction. The RZZ gates symbolize the ZZ interaction. This circuit design is scalable to larger qubit systems while maintaining the same structural framework. It is usually better to choose a circuit depth equal to half the number of qubits. Each independent qubit-interval CZ gate is individually characterized by QPT before operation.

Part of topology graph from the Baiwang device in the Quafu cloud platform.

Probability distributions from standard and QPT-MPDO simulations compared with demonstration results, utilizing Jensen-Shannon divergence for distribution distance measurement. Orange bars represent demonstration results from the Quafu cloud platform, blue bars represent simulations with a standard quantum noise model parameterized near physical parameters, and green bars show simulations using real noise QPT data from Quafu. The standard noise model inadequately mimics the actual system, showing a divergence of 2.68 × 10⁻², while the real noise model aligns more closely with demonstration outcomes, with a divergence of 3.64 × 10⁻⁴. Minor discrepancies between QPT-MPDO simulation and demonstration data may arise due to the truncation processes in the tensor network framework.
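
The distance measure quoted in the caption is standard; a minimal sketch of the Jensen-Shannon divergence between two bitstring probability distributions (base-2 logarithm, so the value lies in [0, 1]) follows. The example distributions are invented.

```python
import numpy as np

def js_divergence(p, q, base=2):
    # JSD(P, Q) = 0.5 KL(P || M) + 0.5 KL(Q || M), with M = (P + Q) / 2.
    p, q = np.asarray(p, float), np.asarray(q, float)
    m = 0.5 * (p + q)
    def kl(a, b):
        mask = a > 0
        return np.sum(a[mask] * np.log(a[mask] / b[mask])) / np.log(base)
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

p = [0.5, 0.3, 0.2, 0.0]
print(js_divergence(p, p))                      # 0.0 for identical distributions
print(round(js_divergence([1, 0], [0, 1]), 6))  # 1.0, the base-2 maximum
```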

Simulation results for the entangled quantum state generation using a QAOA circuit across varying qubit configurations. (a) Probability distribution for an ideal simulation with three qubits, establishing a reference case for this QAOA application for initial state |ψ0⟩ = sin(π/8)|0⟩ + cos(π/8)|1⟩. (b)–(d) Comparative analysis of ideal and noisy simulations for three-qubit, depth-one (b), five-qubit (c), and seven-qubit (d) systems. The main graphs track the loss function and state fidelity across simulation epochs, employing a gradient descent algorithm, with the shaded areas representing error bars. The insets highlight the respective bitstring probability distributions, reflecting the distinction between ideal and real-noise conditions.

Two example graphs for MaxCut problem.

Maximum cut problem on full-connection simulator. (a) Illustration of a four-qubit graph where the optimal partition {(0, 2), (1, 3)} results in all four edges being cut. (b) For a five-qubit graph, three optimal partitioning methods—{(2, 3), (0, 1, 4)}, {(0, 3), (1, 2, 4)}, and {(1, 4), (0, 2, 3)}—achieve a maximum of four-edge cuts. (c) Outcome of the Quantum Approximate Optimization Algorithm applied to the maximum cut problem, with results for qubit number Q = 4 and circuit depth p = 2 leading to quantum states |0101⟩ and |1010⟩, indicating the grouping of qubits (vertices) into sets. For Q = 5 and p = 3, the quantum states |00110⟩, |01001⟩, |01101⟩, |11001⟩, |10110⟩, and |10010⟩ represent the corresponding vertex classifications as detailed in panel (b). The blue and orange bars denote the simulation outcomes with ideal quantum gates and with real gate noise, respectively.
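
The four-qubit instance can be checked classically by brute force. The edge list below is an assumption (a 4-cycle), chosen because it is the graph on which the stated optimum {(0, 2), (1, 3)} cuts all four edges as in panel (a).

```python
from itertools import product

edges = [(0, 1), (1, 2), (2, 3), (3, 0)]  # assumed 4-cycle graph

def cut_value(assign, edges):
    # Number of edges whose endpoints land in different partitions.
    return sum(assign[u] != assign[v] for u, v in edges)

best = max(product([0, 1], repeat=4), key=lambda a: cut_value(a, edges))
print(cut_value(best, edges))                 # 4 edges cut
print(best in [(0, 1, 0, 1), (1, 0, 1, 0)])  # the {(0,2),(1,3)} bipartition
```

The two optimal bitstrings 0101 and 1010 match the dominant QAOA outputs described in panel (c).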

Overview of quantum circuit composition and crosstalk handling. (a) Illustration of a randomized quantum circuit for three qubits (denoted as Q = 3). Each layer of the circuit comprises a set of single-qubit gates—randomly chosen from RX, RY, and RZ gates—followed by a layer of CNOT gates representing entanglement operations. (b) Integration of a two-qubit gate into the simulation circuit, represented as a tensor, incorporates the effects of crosstalk by including interactions with the adjacent qubit. The bottom legend identifies the gate symbols used in the circuit diagram. It is readily applicable to experimental data; for instance, as shown in (b), one simply performs QPT on all qubits deemed to be influenced by noise and integrates this data into the simulation.

Fidelity of truncated quantum states under varying maximum bond dimension χ and maximum inner dimension κ along with error bars, calculated from 1000 random circuits for two scenarios: (a) qubit count Q = 3 and (b) qubit count Q = 6 . For smaller number of quantum bits or shallower quantum circuit depths, employing smaller χ and κ values enables highly faithful numerical simulation of real noise in quantum circuits. As the circuit depth and system size increase, greater noise and entanglement are introduced into the quantum system. Consequently, it becomes necessary to scale up the values of χ and κ proportionally to achieve sufficiently accurate approximations for simulating the scalable noisy system.



Sony Bravia Theater Quad Review: The Soundbar-Free Experiment


  • Daniel Glaser is the creator behind the YouTube channel NeverEnoughTech, which focuses on audio equipment reviews and recommendations, with a particular emphasis on soundbars. Known for his honest and sassy approach, Glaser provides viewers with candid advice to help them make informed purchasing decisions in the world of audio gear. His content aims to cut through marketing hype and offer practical insights for audio enthusiasts and everyday consumers alike.

Well, if you showed up, you're probably intrigued by Sony's soundbar-free, four-maybe-five-maybe-six piece 360 spatial sound mapping experiment. As a YouTuber who's been reviewing speakers, soundbars, and other audio products for four years, I've seen my fair share of bold claims with various levels of follow-through. The Sony Bravia Theater Quad, priced similarly to a seasoned golf cart at $2,500, aims to deliver immersive sound without even giving you a soundbar. It's a significant price hike from its predecessor, the HT-A9, which you can now snag for a cool $1,500. But does this price jump come with a proportional leap in performance? Buckle up, audio enthusiasts, because we're about to dive deep into this soundbar-free experiment.


Key Takeaways

Let's lay out the basics faster than you can say "overpriced audio gear." The Sony Bravia Theater Quad will set you back $2,500, a whopping $1,000 more than the HT-A9. It boasts 16 drivers, up from the A9's 12, pumping out 504 watts of audio goodness through four wireless speakers. This bad boy supports all the fancy audio formats you'd expect: Dolby Atmos, DTS:X, IMAX Enhanced, and Sony's own 360 spatial audio. Plus, it's got 2.5 times the wireless signal strength of the A9, which should mean fewer dropouts and a higher-quality stream. Not too shabby for a soundbar-less wonder, eh?

Pros and Cons of the Sony Bravia Theater Quad


Let's start with the good stuff, shall we? The Quad's 4.0.4-channel sound setup is no joke. It supports more audio formats than you can shake a remote at. The 16 drivers pack a serious punch, and the improved wireless connectivity means you're less likely to experience those annoying dropouts that plagued the A9 like a bad case of audio hiccups. Plus, it's got wide codec support, so whether you're streaming from your fancy new gaming console or dusting off your old Blu-ray player (Angry Birds 2 in IMAX Enhanced, anyone?), the Quad's got you covered.

Now, let's address the elephant in the room – that $2,500 price tag. Ouch. That's enough to make your wallet pinch your rear in protest. The limited connectivity options are also a bit of a head-scratcher. With only one HDMI input, you might find yourself playing a frustrating game of "which device gets to be connected today?" And if you've got furry friends, beware – those fabric-covered speakers might just become the world's most expensive cat scratching posts. Lastly, the lack of a built-in subwoofer might leave some bass heads feeling low … but not in a good way.


Design and Physical Appearance

Speaker Design


Sony's gone for a sleek, modern look with the Quad that screams "I'm not a speaker, I swear!" Each speaker is a rectangular box wrapped in what I can only describe as "living room material." It's almost square, a bit wider than tall, and could easily be mistaken for an air purifier, cat scratcher, or a modern art statement your rich friends will love.

Now, I've got to be honest – these speakers look too thin to be serious home theater contenders. They're smaller and lighter than Samsung's much cheaper Music Frame, which makes you wonder if Sony's trying to pull a fast one on us. It's like they're trying to walk a tightrope between blending into your living room and being the baddest-sounding thing on the block. Let's hope they don't fall flat on their face.


Controller Box Changes

The control box got a glow-up too. It's bigger now, with a transparent plastic oval covering the display. Your kids with their eagle eyes can probably read it from 10 to 12 feet away. Sony's toned down the branding, swapping the gold logo for a matte black one and ditching the high-res sticker. 

Features and Setup

Setting up the Quad is easier than explaining to your significant other why you need a $2,500 speaker system. The Bravia Connect app guides you through the process like a patient audio sherpa.


However, I did run into some issues with the sound field optimization process. The app kept telling me the mic was blocked or there was too much background noise. I tried everything short of sending my family on a weekend vacation and draining the gas from all the lawn mowers in the neighborhood. Eventually, I just held the phone right up to the problem speaker, and voila! It worked. Not ideal, but hey, the show must go on.

The Subwoofer Question


Now, let's talk about the elephant-sized subwoofer in the room. The Quad comes with an optional SA-SW3 or SA-SW5 subwoofer. The SA-SW5 is the one worth considering and will set you back another $700. Do you need it? You can pretend you don’t … but we can wait.

Without the sub, the Quad's bass performance is surprisingly good. Each speaker now has three drivers instead of two, including a bass-oriented woofer with a bass reflex port. This setup gives you a bit more oomph than you'd expect from what look like carpet samples.


Adding the SA-SW5 allows the system to be its best self. Suddenly, those tribal drums in "Black Panther" have real weight, and the bass drop in Skrillex's "Bangarang" might just force a dance party – stretch first if it's been a while.

My advice? Start with the Quad on its own. If you find yourself consistently wishing for more bass (or if you really want to annoy your downstairs neighbors), then consider adding the sub. This way, you can spread out the cost and really appreciate the difference the subwoofer makes.


Performance


The Quad's performance is like that overachieving kid in class who not only aces the test but also brings the teacher an apple. Its 360 Spatial Sound Mapping creates an audio experience so immersive, you might forget you're in your living room and not actually on Arrakis fighting sand worms.

For movies, the Quad delivers a wide, enveloping soundstage that'll make you wonder if you've accidentally walked directly into the scene. The wide front separation is a sound-enveloping magic trick that soundbars just can't replicate. Sound effects are placed with surprising accuracy, and dialogue is surprisingly clear considering there is no center channel. Just don't expect it to get hospital-trip loud – you can reach impressive volumes, but your clothes will remain in place.

Music performance is surprisingly refreshing on the Quad. Remember how the A9 used to diss your '90s playlist? Well, the Quad's here to save the day. With its extra woofer in each speaker, it brings more dirt to the grungy stuff, making Pearl Jam's "Black" and Stone Temple Pilots' "Meat Plow" sound the way they were meant to. And don't even get me started on how it handles Pink Floyd's "Comfortably Numb" – the chorus floats beautifully all around you in a way neither stereo nor soundbars can mimic.

Comparison to HT-A9

Having extensively tested both the A9 and the Quad, I can confidently say: the Quad is the superior sounding system, hands down. It's not even up for discussion. If you're starting fresh, go for the Quad. Yes, you'll feel the sting of that price tag, but once you set it up and start listening, you'll likely feel justified. And if you already own the A9? Well, selling it to upgrade might be worth considering if you're an enthusiast and a heavy user. Just don't end up like me, owning both systems – that's a level of audio addiction best avoided.


Final Thoughts

So, there you have it, folks. The Sony Bravia Theater Quad represents a bold step forward in home audio technology, challenging conventional home theater setups. At $2,500, it's a significant investment that delivers superior home theater performance without the complexity of traditional separates.

The Quad blows away soundbars and gives separates a run for their money, offering a level of audio immersion that must be heard to be believed. Its innovative 4.0.4-channel configuration, with 16 drivers and 504 watts of power, creates an expansive and precise soundstage that rivals much more expensive AVR and speaker combinations.

Versatility is a key strength - with support for Dolby Atmos, DTS:X, IMAX Enhanced, and Sony's 360 spatial audio, it handles virtually any audio format with ease. From nuanced dialogue to explosive action, the Quad delivers with a clarity and precision that sets a new standard for wireless speaker systems.

Is it perfect? No. Is it expensive? Hell yes. But if you're looking for a soundbar-free system that can deliver an immersive audio experience without blocking your TV, the Quad might just be your audio soulmate. It's a significant improvement over the A9, addressing many of its predecessor's shortcomings. While it has minor drawbacks like having just a single HDMI port, these are overshadowed by the Quad's overall performance. And bass enthusiasts may want to consider the additional SA-SW5 subwoofer, which further elevates the system's capabilities.

In recognition of its groundbreaking design and exceptional performance, the editors at HomeTheaterReview are proud to award the Sony Bravia Theater Quad an Editor's Choice Award. This accolade reflects not just the system's technical prowess, but its potential to reshape how we think about home audio setups.


  1. Designing the design of experiments (DOE)

    1. Introduction. Although developed primarily for agricultural purposes by British statistician Sir Ronald Fisher in the 1920s [1], the design of experiments (DOE) as a statistical method has been widely applied in different fields of science and industry, especially to support the design, development, and optimization of products and processes [2]. The design of experiments includes a series ...

  2. Design of Experiments (DOE): Applications and Benefits in Quality

    One powerful tool used in quality control and assurance is the design of experiments (DOE). Engineers and technologists often use DOE methodologies for applications ranging from the design of new products, improvement of design, maintenance, control and improvement of manufacturing processes, maintenance and ...

  3. Design of Experiments

    Design of experiments (DoE) (Fisher 1935) is the formal process of designing an experimental protocol and analyzing the empirically collected data in order to discover valid and objective information about an underlying system (Montgomery 2008). This definition is deliberately vague because of the very wide applicability of this approach. The term design of experiments can be interpreted as the ...

  4. Design of computer experiments: A review

    Introduction. The design of experiments (DoE) is a procedure to plan and define the conditions for performing controlled experimental trials. Its earliest footprints can be traced back to the era of the Old Testament. The first known example of DoE dates back to the second century BC, in the first chapter of the Book of Daniel (Stigler, 1974). In this text, Daniel showed the superiority of his ...

  5. Fundamentals of Experimental Design: Guidelines for Designing ...

    The last pillar of experimental design is the least understood and possesses the least amount of theoretical results to support empirical observations. This subject receives little or no coverage in the textbooks that deal with experimental design (e.g., less than one page in Steel et al., 1996), perhaps owing to the lack of theoretical results ...

  6. Design of computer experiments: A review

    Abstract. In this article, we present a detailed overview of the literature on the design of computer experiments. We classify the existing literature broadly into two categories, viz. static and adaptive design of experiments (DoE). We begin with the abundant literature available on static DoE, its chronological evolution, and its pros and cons.

  7. Design of Experiments: An Overview and Future Paths

    Abstract. Process optimization is increasingly important as competition deepens. As such, the optimization of process parameters allows the manufacturer to improve its competitive advantage. In that context, planning experiments to test the limits of the system is progressively more important, and Design of Experiments is a group ...

  8. A Design of Experiments (DoE) Approach Accelerates the ...

    Design of experiments (DoE) is a statistical approach to process optimization that is used across a variety of industries. ... the numerous practical and scientific advantages of the DoE approach ...

  9. (PDF) Design of experiments application, concepts, examples: State of

    Abstract and Figures. Design of Experiments (DOE) is a statistical tool deployed in various types of system, process, and product design, development, and optimization. It is a multipurpose tool that ...

  10. Design of experiments

    The design of experiments (DOE or DOX), also known as experiment design or experimental design, is the design of any task that aims to describe and explain the variation of information under conditions that are hypothesized to reflect the variation. The term is generally associated with experiments in which the design introduces conditions that ...

  11. Design of Experiments and machine learning for product innovation: A

    Design of Experiments (DOE) is a statistical method, which guides the execution of experiments, which in turn are analyzed to detect the relevant variables and optimize the process or phenomenon under investigation. 4 The use of DOE in product innovation (PI) can result in products that are easier and cheaper to manufacture, that have enhanced ...

  12. PDF DESIGN OF EXPERIMENTS (DOE) FUNDAMENTALS

    Learning Objectives: Have a broad understanding of the role that design of experiments (DOE) plays in the successful completion of an improvement project. Understand how to construct a design of experiments. Understand how to analyze a design of experiments. Understand how to interpret the results of a design of experiments.

  13. Design of Experiments (DoE) and Process Optimization. A Review of

    Statistical design of experiments (DoE) is a powerful tool for optimizing processes, and it has been used in many stages of API development. This review summarizes selected publications from Organic Process Research & Development using DoE to show how processes can be optimized efficiently and how DoE findings may be applied to scale-up.

  14. Design of Experiments (DoE) and Process Optimization. A Review of

    Statistical design of experiments (DoE) is a powerful tool for optimizing processes, and it has been used in many stages of API development. This review summarizes selected publications from ...

  15. What Is Design of Experiments (DOE)?

    Quality Glossary Definition: Design of experiments. Design of experiments (DOE) is defined as a branch of applied statistics that deals with planning, conducting, analyzing, and interpreting controlled tests to evaluate the factors that control the value of a parameter or group of parameters. DOE is a powerful data collection and analysis tool ...

  16. 5: Experimental Design

    Experimental design is a discipline within statistics concerned with the analysis and design of experiments. Design is intended to help research create experiments such that cause and effect can be established from tests of the hypothesis. We introduced elements of experimental design in Chapter 2.4. Here, we expand our discussion of ...

  17. Design of Experiments

    Design of experiments applied to lithium-ion batteries: A literature review. L.A. Román-Ramírez, J. Marco, Applied Energy, 2022. Highlights: The Design of Experiments methodology and statistical analysis is introduced. Design of experiments is a valuable tool for the design and development of lithium-ion batteries.

  18. State-of-the-Art Review of Design of Experiments for Physics-Informed

    This paper presents a comprehensive review of the design of experiments used in the surrogate models. In particular, this study demonstrates the necessity of the design of experiment schemes for the Physics-Informed Neural Network (PINN), which belongs to the supervised learning class. Many complex partial differential equations (PDEs) do not have any analytical solution; only numerical ...

  19. Guide to Experimental Design

    Table of contents. Step 1: Define your variables. Step 2: Write your hypothesis. Step 3: Design your experimental treatments. Step 4: Assign your subjects to treatment groups. Step 5: Measure your dependent variable. Other interesting articles. Frequently asked questions about experiments.

  20. Design of Experiments: A Modern Approach, 1st Edition

    Design of Experiments: A Modern Approach introduces readers to planning and conducting experiments, analyzing the resulting data, and obtaining valid and objective conclusions. This innovative textbook uses design optimization as its design construction approach, focusing on practical experiments in engineering, science, and business rather than orthogonal designs and extensive analysis.

  21. State-of-the-Art Review of Design of Experiments for Physics-Informed

    This paper presents a comprehensive review of the design of experiments used in the surrogate models. In particular, this study demonstrates the necessity of the design of experiment schemes for the Physics-Informed Neural Network (PINN), which belongs to the supervised learning class. Many ...

  22. Design and Analysis of Experiments

    1.3 Completely Randomized Design (CRD) 1.3.1 Example - Anthocyanin Extractability in Cabernet Franc Grapes; 1.4 Randomized Complete Block Design (RCBD or RBD) 1.4.1 Example - Comparison of 4 Treadmill Models for User Satisfaction; 1.5 Overview of Some Standard Experimental Designs. 1.5.1 Example - Advertising Messaging Strategy and Attitude to ...

  23. What is DOE? Design of Experiments Basics for Beginners

    Using Design of Experiments (DOE) techniques, you can determine the individual and interactive effects of various factors that can influence the output results of your measurements. You can also use DOE to gain knowledge and estimate the best operating conditions of a system, process or product. DOE applies to many different investigation ...

  24. Going beyond the comparison: toward experimental instructional design

    To design effective instruction, educators need to know what design strategies are generally effective and why these strategies work, based on the mechanisms through which they operate. Experimental comparison studies, which compare one instructional design against another, can generate much needed evidence in support of effective design strategies. However, experimental comparison studies are ...

  25. Design of Experiments (DoE) and Process Optimization. A Review of

    Statistical design of experiments (DoE) is a powerful tool for optimizing processes, and it has been used in many stages of API development. This review summarizes selected publications from Organic Process Research & Development using DoE to show how processes can be optimized efficiently and how DoE findings may be applied to scale-up.

  26. A Quantitative Systematic Literature Review of Combination Punishment

    This review evaluated single-case experimental design research that examined challenging behavior interventions utilizing punishment elements. Thirty articles published between 2013 and 2022 met study inclusion criteria. Study quality was also assessed.

  27. A scoping review of life design intervention research: Implications for

    The recently developed paradigm in career counselling known as life design has caused a proliferation of new interventions. A scoping study was performed to provide an overview of empirical support for the effectiveness of these interventions. Twelve articles that evaluate the efficacy of eight interventions were found. Interventions included individual and group forms of the Career ...

  28. Phys. Rev. A 110, 032604 (2024)

    In recent years, efficient quantum circuit simulations incorporating ideal noise assumptions have relied on tensor network simulators, particularly leveraging the matrix product density operator (MPDO) framework. However, experiments on real noisy intermediate-scale quantum (NISQ) devices often involve complex noise profiles, encompassing uncontrollable elements and instrument-specific effects ...

  29. Sony Bravia Theater Quad Review: The Soundbar-Free Experiment

    Key Takeaways. Let's lay out the basics faster than you can say "overpriced audio gear." The Sony Bravia Theater Quad will set you back $2,500, a whopping $1,000 more than the HT-A9. It boasts 16 ...

  30. A comprehensive review of Design of experiment (DOE) for water and

    A review on design of experiments and surrogate models in aircraft real-time and many-query aerodynamic analyses. Discusses contextualized examples of DOE used specifically in aircraft modeling; uses the concept of a surrogate-based model approach as a rapid and cheap simulation means ...
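
Most of the reviews above build on the same basic construction: the two-level full factorial design. A minimal sketch follows, with main-effect estimation over a 2^3 design; the factor names and the toy response function are invented for illustration.

```python
from itertools import product

# Two-level full factorial design: 2^3 = 8 runs in coded units (-1, +1).
factors = ["temperature", "pressure", "catalyst"]
design = list(product([-1, +1], repeat=len(factors)))

def response(run):
    # Toy "true" process model: main effects plus one interaction.
    t, p, c = run
    return 50 + 4 * t - 2 * p + 1.5 * t * p + 0.5 * c

y = [response(run) for run in design]

# Main effect of a factor = mean(y at +1) - mean(y at -1).
for i, name in enumerate(factors):
    hi = sum(yi for run, yi in zip(design, y) if run[i] == +1) / 4
    lo = sum(yi for run, yi in zip(design, y) if run[i] == -1) / 4
    print(f"{name}: main effect = {hi - lo:+.1f}")
```

Because the design is balanced, the t·p interaction averages out of each main effect, and the estimated effects come out as twice the model coefficients (+8, −4, +1).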