The Importance of Variance Analysis


This critical topic is too often taught to only a handful of students—or neglected in the b-school curriculum altogether.

Variance analysis is an essential tool for business graduates to have in their toolkits as they enter the workforce. Over our decades of experience in executive education, we’ve observed that managers across all industries and functions use variance analysis to measure the ability of their organizations to meet their commitments.

Because variance analysis is such a powerful risk management tool, there is a strong case for including it in the finance portion of any MBA curriculum. Yet fewer than half of finance professors believe they should be teaching this subject; they view it as a topic more typically taught in accounting classes. At the same time, in practice, variance analysis is such a cross-functional tool that it could be taught throughout the business school curriculum—but it’s not. We perceive a worrisome disconnect between the way variance analysis is taught and the way it is used in real life.

Variance Analysis and Its Applications

There are three periods in the life of a business plan: prior period to plan, plan to actual, and prior period to actual. For instance, if a business plan is being formulated for 2019, “prior period to plan” compares 2018 results with the 2019 budget, “plan to actual” compares the 2019 budget with what really happens in 2019, and “prior period to actual” compares 2018 results with the 2019 actuals. These three stages are also referred to as planning, meeting commitments, and growth.

For each of these periods, variance analysis looks at the deviations between the targeted objective and the actual outcome. The most common variances are found in price, volume, cost, and productivity. When executives conduct an operational review, they will need to explain why there were positive or negative variances in any of these areas. For instance, did the company miss a target because it lost an anticipated national account or failed to lock in a price contract due to competitive pressure?

Executives who understand variances will improve their risk management, make better decisions, and be more likely to meet commitments. In the process, they’ll produce outcomes that can give an organization a real competitive advantage and, ultimately, create shareholder value.

Most businesses apply variance analysis at the operating income level to determine what they projected and what they achieved. The variances usually are displayed in the form of floating bar charts—also known as walk, bridge, or waterfall charts. These graphics are often used in internal corporate documents as well as in investor-facing documents such as quarterly earnings presentations.
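
To make the walk-chart format concrete, here is a minimal Python sketch (using matplotlib) that draws a plan-to-actual operating income walk. The variance categories and dollar amounts are invented for illustration; a real chart would be driven by a company’s own plan and actual figures.

```python
import matplotlib.pyplot as plt

# Hypothetical plan-to-actual walk (all figures invented for illustration).
plan = 100.0                                  # planned operating income, $M
steps = [("Price", 8.0), ("Volume", -12.0),
         ("Cost", -5.0), ("Productivity", 4.0)]

# Each variance bar "floats" at the running total of everything before it.
bars, bottoms, running = [plan], [0.0], plan
for _, v in steps:
    bars.append(v)
    bottoms.append(running)
    running += v
bars.append(running)                          # final bar: actual result
bottoms.append(0.0)

labels = ["Plan"] + [name for name, _ in steps] + ["Actual"]
colors = (["steelblue"]
          + ["seagreen" if v >= 0 else "indianred" for _, v in steps]
          + ["steelblue"])

plt.bar(range(len(bars)), bars, bottom=bottoms, color=colors)
plt.xticks(range(len(bars)), labels, rotation=30, ha="right")
plt.ylabel("Operating income ($M)")
plt.title("Operating income walk: plan to actual")
plt.tight_layout()
plt.show()
```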

While variance analysis can be applied in many functional areas, it is used most often in finance-related fields. Yet, the majority of finance programs at both the graduate and undergraduate levels don’t cover it at all. We surveyed finance faculty in 2013 and accounting faculty in 2017 to determine how they teach and use variance analysis. Among other things, we learned that:

  • More than 80 percent of accounting faculty believe that variance analysis is important to a finance career, and they are far more likely to teach it than their finance faculty colleagues.
  • Only 59 percent of finance faculty and 48 percent of accounting faculty are familiar with examples of walk charts from real-world companies. Yet these visual portrayals of operating margin variances are commonplace in quarterly earnings presentations and readily found on investor relations websites.

Because universities mostly fail to teach this important topic, corporate educators have been left to fill the learning gap. Many global organizations, in fact, make variance analysis a key subject in their development programs for entry-level financial professionals.

The University Response

We believe it’s critical for universities to better align their curricula with the skills that today’s employers seek in the graduates they hire. Not only do we think variance analysis should be included in the business curriculum, but we could even make an argument for running it as a capstone business course. We offer these suggestions for ways that faculty could integrate this powerful tool across the business school program:

  • Both accounting and finance faculty should, as much as is practical, incorporate variance analysis into their classes, focusing particularly on financial planning and analysis. We acknowledge that a dearth of corporate finance texts on the topic will make this a challenge for finance professors. The two of us use teaching materials based on our corporate experience in our graduate business and undergraduate finance classes, and we would be glad to share them with others.
  • Faculty who use case studies should always include a case specific to variance analysis tools. Students who pursue careers in corporate finance will almost certainly be required to use such tools, particularly as data and predictive analytics applications are enhanced to improve forecasting accuracy. Two sources of such case studies are TRI Corporation and Harvard Business Publishing.
  • Professors can introduce students to real-world applications of variance analysis by showing how it is used in investor relations (IR) pitches. As instructors, the two of us routinely search IR sites for applications of variance analysis. We specifically look for operating margin variance walks (floating bars, brick charts) for visual applications that can make the topic come to life for students. Here’s an example from Ingersoll Rand:

Ingersoll Rand variance analysis chart

  • Faculty from accounting and finance programs should collaborate on when, where, and how to teach variance analysis. At the very least, this will ensure that students gain an understanding of the topic from either a finance or an accounting perspective, but the ideal would be for them to benefit from both perspectives for a holistic understanding. At Fairfield University, accounting programs introduce students to the theory of variance analysis. Then finance programs take an operational and cross-functional approach that addresses planning, meeting commitments, and growth.
  • Both accounting and finance faculty should help finance majors understand variance analysis from a practitioner’s standpoint. Discussions about pricing, supply chain, manufacturing costs, risk management, and inflation and deflation around cost inputs can help students grasp the necessity of making trade-offs and balancing short-term and long-term business goals. To make sure students understand the practitioner’s viewpoint, we use corporate business simulations that are more operationally focused, as opposed to being academic in tone.
  • To extend the topic to all majors, not just finance and accounting students, faculty from disciplines such as strategy and operations could also incorporate variance analysis into their classes. For instance, if they use business simulations for their capstone courses, they could add a component that covers variance analysis. At Fairfield, we use a variety of competitive business simulations from the corporate world.
  • Finally, professors can bring in guest speakers from almost any business functional area and ask them to explain, as part of their presentations, how variance analysis is relevant in their fields. As an example, we often have senior finance executives from Stanley Black & Decker—a company well-known for its ability to grow and meet its commitments via variance analysis—present to our graduate program. We tap other companies from Fairfield County as well.

In the graduate classes we teach at Fairfield University, we have always tried to connect theory with practice. And we’ve long believed that creating a culture of meeting and exceeding commitments requires aligning interaction across functions in the workplace. With this article, we hope that, at the very least, we can start a larger discussion about the need for cross-disciplinary teaching of variance analysis.

  For more about variance analysis materials, contact us at  [email protected]  or  [email protected] .


What is Variance Analysis: Types, Examples and Formula


Key Takeaways

  • Variance analysis compares actual cash flows with expected cash flows and tracks the financial metrics of your business.
  • Each variance analysis formula measures a specific financial metric, providing insight into a particular aspect of performance.
  • Leveraging AI to analyze variances helps stakeholders better understand their finances and make well-informed decisions.


Introduction

In any business, a grasp of projected cash flows and available cash is crucial for daily financial operations. Enterprises use variance to measure the disparity between expected and actual cash flow.

Variance analysis involves assessing the reasons for variances and understanding their impact on financial performance. In a volatile global economy, variance is a more important metric than ever for enterprises to track: it tells you how accurate your cash forecasts are and whether you need to adjust your financial plans or take corrective action.

By the end of this blog, you will understand variance analysis, why it matters, and how to calculate it, so you can deploy cash properly and make strategic, informed business decisions.

What is Variance Analysis?

Variance analysis measures the difference between the forecasted cash position and the actual cash position. A positive variance occurs when actual cash flow surpasses the forecasted amount, while a negative variance indicates the opposite. Variance analysis helps you understand where you went over or under budget and why. 

This analysis provides insights into budget deviations and their underlying causes. It holds significance by enabling financial performance monitoring, trend identification, and informed decision-making for future planning. Through variance analysis, you can stay aligned with financial objectives and progressively enhance your profitability.

Types of Variance Analysis

Different types of variances can occur in the cash forecasting process due to reasons such as changes in market scenarios, customer behavior, and timing issues, among other factors. These variances can impact both sales revenue and expenses. By understanding the core impacts of these variances, companies can make necessary adjustments to their budgets, mitigate risks, and improve their overall financial performance.

Broadly, variances can be classified into two major categories:

  • Materials, Labor, and Variable Overhead Variances
  • Fixed Overhead Variances


Materials, labor, and variable overhead variances

These include price/rate variances and efficiency and quantity variances. Price/rate variances show the difference between standard costs and the actual prices paid for materials, while efficiency and quantity variances capture the difference between actual input values and the expected input values. This analysis plays a crucial role in managing procurement costs, making informed decisions, optimizing cost structures, and maintaining positive cash flow.

Fixed overhead variances 

Fixed overhead variances include volume variances and budget variances. Volume variances measure the difference between the actual revenue and budgeted revenue that is derived solely from changes in sales volume. Meanwhile, budget variances indicate the differences between actual and budgeted amounts. These variances help businesses understand the influence of sales volume fluctuations on financial performance, provide insights into the effectiveness of financial planning , and identify areas of overperformance or underperformance. 

Budget variances 

Budget variances can be divided into two subgroups: expense variances and revenue variances. Expense variances compare actual costs to budgeted costs, while revenue variances compare actual revenue to budgeted revenue. Positive revenue variances represent revenue that exceeds expectations, while negative revenue variances represent revenue that falls short of them.

Budget variance analysis is important for understanding the reasons behind deviations from budgeted amounts. It enables the identification of avenues for enhancing business processes, boosting revenue, and cutting costs. By examining revenue variances, you can uncover possibilities for long-term efficiency improvements and increased business value.

Let’s take a look at an example of variance in budgeting:

Let’s say that your enterprise sells gadgets, and you’ve projected that you’ll sell $1 million worth of gadgets in the next quarter. However, at the end of the quarter, you find that you’ve only sold $800,000 worth of gadgets. That’s a variance of $200,000, or 20% of your original plan. By analyzing this variance, you can figure out what went wrong and take steps to improve your sales performance in the next quarter. Here, variance analysis becomes the vital tool that enables you to quickly identify such changes and adjust your strategies accordingly to manage your financial performance and optimize cash forecasting.


Role of Variance Analysis 

In periods of market instability, your business could face unforeseen fluctuations in revenue, costs, or other financial indicators. In such cases, one of the most crucial tools in your financial management system is variance analysis.

Variance analysis allows you to track the financial performance of your organization and implement proactive measures to reduce risk and enhance financial health. It enables businesses to compare their expected cash flow with their actual cash flow and to identify the root causes of any discrepancies, giving them an important understanding of their cash flow performance so they can decide on appropriate actions in response to fluctuating market conditions.

For instance, if a company realizes its cash inflows are lower than expected, it can cut costs or alter its pricing strategy to stay profitable. Likewise, if its actual cash outflows exceed the forecast because of unforeseen costs, it can modify its financial plan or explore other funding options.

Variance Analysis Formula 

The key components of variance are relatively straightforward: actuals versus expectations. Let’s look at the key variance analysis formulas, each of which focuses on a specific financial metric. These formulas unveil gaps between expected and actual results, providing insight into specific aspects of performance.

Cost variance formula

Cost variance measures the difference between actual costs and budgeted costs. The cost variance formula is:

Cost Variance = Actual Costs – Budgeted Costs

This formula helps identify cost control issues, inefficiencies, and opportunities for improvement.

Efficiency variance formula

Efficiency variance measures the difference between actual input values (e.g., labor hours, machine hours) and budgeted or standard input values. The efficiency variance formula is:

Efficiency Variance = (Actual Input – Budgeted Input) × Standard Rate

This formula helps organizations identify variations in productivity and pinpoint areas for improvement.

Volume variance formula

Volume variance, also known as sales volume variance, measures the impact of changes in sales volume on revenue compared to the budgeted volume. The volume variance formula is:

Volume Variance = (Actual Sales Volume – Budgeted Sales Volume) × Budgeted Selling Price

This formula helps organizations understand the contribution of sales volume to revenue performance.

Budget variance formula

Budget variance compares actual revenue with budgeted revenue. The budget variance formula is:

Budget Variance = Actual Revenue – Budgeted Revenue

This formula aids in evaluating pricing strategies, market demand, and sales effectiveness.
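
To make the four formulas concrete, here is a minimal Python sketch that implements them directly. The function names and sign conventions (negative means under budget for cost-type metrics) are illustrative choices of ours, not part of any standard library:

```python
def cost_variance(actual_costs, budgeted_costs):
    """Cost Variance = Actual Costs - Budgeted Costs (negative = under budget)."""
    return actual_costs - budgeted_costs

def efficiency_variance(actual_input, budgeted_input, standard_rate):
    """Efficiency Variance = (Actual Input - Budgeted Input) x Standard Rate."""
    return (actual_input - budgeted_input) * standard_rate

def volume_variance(actual_volume, budgeted_volume, budgeted_price):
    """Volume Variance = (Actual Sales Volume - Budgeted Sales Volume)
    x Budgeted Selling Price."""
    return (actual_volume - budgeted_volume) * budgeted_price

def budget_variance(actual_revenue, budgeted_revenue):
    """Budget Variance = Actual Revenue - Budgeted Revenue."""
    return actual_revenue - budgeted_revenue
```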

Examples of Variance Analysis 

For instance, consider a company that plans to create a new mobile app with a projected cost of $50,000. The expected timeline for completion is four months, with a budgeted labor cost of $10,000 per month. The target is to release the application with 10 key features. The following examples demonstrate the different types of variances under this scenario:

Cost variance:

During the development process, the company implements cost-saving measures and efficient resource allocation, resulting in lower actual costs. The actual cost of the project at completion is $45,000. The cost variance can be calculated as follows:

Cost Variance = $45,000 – $50,000 = -$5,000

Here, the negative cost variance of -$5,000 indicates that the company has achieved cost savings of $5,000 compared to the budgeted cost for the project.

Efficiency variance:

The project is efficiently managed, and the team completes the development in 3.5 months instead of the budgeted 4 months. Assuming a budgeted labor cost of $10,000 per month, the efficiency variance can be calculated as follows:

Efficiency Variance = (3.5 months – 4 months) × $10,000 = -$5,000

The negative efficiency variance of -$5,000 indicates that the project was completed ahead of schedule, resulting in labor cost savings of $5,000.

Volume variance:

The final version of the mobile application is released with 12 key features instead of the budgeted 10 features. Assuming a budgeted revenue of $2,000 per feature, the volume variance can be calculated as follows:

Volume Variance = (12 features – 10 features) × $2,000 per feature = $4,000

The positive volume variance of $4,000 indicates that the company delivered additional features, resulting in increased revenue of $4,000 compared to the budgeted amount.

Budget variance:

The company spent $8,000 on marketing and promotional activities for the mobile application launch, while the budgeted amount was $10,000. The budget variance can be calculated as follows:

Budget Variance = $8,000 – $10,000 = -$2,000

The negative budget variance of -$2,000 indicates that the company spent $2,000 less than the budgeted amount for marketing and promotional activities.

In these scenarios, the company achieved cost savings, enhanced efficiency, delivered additional features, and spent less than the budgeted amount on marketing expenses. These variances provide insights into cost management, efficiency, revenue generation, and budget adherence within the given software development project scenario.
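
Reusing the helper functions from the sketch earlier, the four worked variances above reduce to a few lines of Python (comments show the expected results):

```python
print(cost_variance(45_000, 50_000))        # -5,000: $5,000 under the project budget
print(efficiency_variance(3.5, 4, 10_000))  # -5,000: half a month of labor saved
print(volume_variance(12, 10, 2_000))       #  4,000: two extra features at $2,000 each
print(budget_variance(8_000, 10_000))       # -2,000: marketing spend under budget
```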

Benefits of Conducting Variance Analysis

Let’s take a look at the top 4 benefits enterprises can reap by conducting variance analysis for cash forecasting:


Identify discrepancies

Variance analysis helps identify discrepancies between actual cash inflows and outflows and the forecasted amounts. Comparing forecasted cash flow with actual cash flow makes discrepancies easy to spot, enabling stakeholders to take corrective measures.

Refine cash forecasting techniques

Conducting variance analysis allows for a review of past forecasts to identify any errors or biases that may have impacted accuracy. This information can be used to refine forecasting techniques, improve future forecasts, make adjustments to existing forecast templates, or build new ones.

Improve financial decision-making

Understanding the reasons for variances can provide valuable insights that can help improve financial decision-making, which is critical in a volatile market. For example, if a variance is caused by unexpected expenses, management may decide to reduce expenses or explore cost-saving measures.

Better cash management

By analyzing variances, companies can identify areas where cash management can be improved. This can include better management of accounts receivable or accounts payable, more effective inventory management, or renegotiating payment terms with the suppliers.

Role of AI in Variance Analysis for Cash Forecasting

Amid turbulent market conditions, as companies prepare for 2024 and beyond, finance chiefs are recommending various enhancements to improve decision-making. The most commonly mentioned improvements are the adoption of digital technologies, AI, and automation, and the enhancement of forecasting, scenario planning, and consistency in measuring key performance indicators, per the Deloitte CFO Signals Survey.

This underscores the significance of adopting advanced technologies, such as AI, for companies preparing for uncertain markets. The challenge with traditional variance analysis is that treasurers struggle to create low-variance cash flow forecasts when they rely on manual methods and spreadsheets to handle large volumes of data. Manual variance reduction is time-consuming, labor-intensive, and expensive, and it delays decision-making.

Here’s how AI addresses this challenge and takes variance analysis to the next level:

  • AI-based cash forecasting software supports variance analysis by taking additional steps to improve the accuracy of the cash forecast to 90-95%.
  • It enables organizations to continuously improve the forecast by understanding the key drivers of variance.
  • It compares cash forecasts to actual results to check for variances, aligning the forecast with monthly, quarterly, and yearly forecasts and ensuring that the forecast is accurate across various scenarios.
  • AI also analyzes the accuracy of cash forecasts through line-item analysis across multiple horizons and tweaks the algorithm through an AI-assisted review process.
  • Finally, AI fine-tunes the forecast model and enhances the data as needed to achieve the desired level of forecast accuracy.


Benefits of Leveraging AI in Variance Analysis

Here are some of the key benefits you can achieve by conducting variance analysis with AI in cash forecasting:


Automated reporting

AI can streamline the reporting of cash flow discrepancies by delivering consistent reports that highlight trends and regularities. Automating reporting saves time and resources that would otherwise be spent on manual work, and it ensures consistent, accurate reporting by removing the chance of human error. By receiving frequent updates on cash flow discrepancies as they occur, you can effectively monitor your business’s cash flow and pinpoint opportunities to optimize your financial results.

Faster, data-driven decision-making

AI can assist in making quicker, better-informed decisions about managing cash flow by providing in-depth insights on cash forecasts in real time. This can assist companies in promptly addressing fluctuations in cash flow and implementing necessary measures. This is especially crucial in periods of market volatility when cash flow trends can quickly fluctuate and unforeseen circumstances may arise.

Real-time cash analysis & better liquidity management 

With AI at its core, cash flow forecasting software can learn from industry-wide seasonal fluctuations to improve forecasting accuracy. AI-powered cash forecasting software that enables variance analysis can also create snapshots of different forecasts and variances and compare them for detailed, category-level analysis. Such comprehensive visibility helps you respond quickly to changes in cash flow, take corrective action as needed, and manage your enterprise’s liquidity better.

Improved cash forecasting accuracy with real-time cash analysis

AI streamlines cash flow analysis by delving deeply into large volumes of data from various sources, such as past cash flow information, market trends, and economic indicators, in real time. This allows for an immediate understanding of cash flow discrepancies and a more in-depth assessment of them, enabling the recognition of trends and patterns that may not be visible through manual review.


How HighRadius Can Help Automate Variance Analysis in Cash Forecasting

In a rapidly evolving business landscape, market uncertainties and disruptions can have a significant impact on an enterprise’s financial stability. That’s why having a robust cash forecasting system with AI at its core is essential for businesses to conduct automated variance analysis. HighRadius’ cash forecasting software enables more advanced and sophisticated variance analysis that helps you achieve up to 95% global cash flow forecast accuracy. 

By leveraging AI capabilities in data analysis, pattern recognition, real-time integration, and predictive modeling, it empowers finance teams to gain deeper insights, improve accuracy, and make more informed decisions to manage cash flow effectively. Furthermore, our solution helps continuously improve the forecast by understanding the key drivers of variance. The AI algorithm learns from historical data and feedback, continuously improving its accuracy and effectiveness over time. This iterative learning process enhances the quality of variance analysis results.

Our AI-based cash forecasting solution supports drilling down into variances across cash flow categories, geographies, and entities to perform root cause analysis, and it helps achieve up to 98% automated cash flow category tagging.

FAQs

1) What is the difference between standard costing and variance analysis?

Standard costing sets an estimated (standard) cost for metrics such as input values, materials, labor, and overhead, based on industry trends and historical data. Variance analysis focuses on analyzing and interpreting the differences (variances) between actual costs and standard costs.

2) What are the three main sources of variance in an analysis?

In variance analysis, the three main sources of variance are material variances (differences in material usage or cost), labor variances (variations in labor productivity or wage rates), and overhead variances (deviations in overhead costs).

3) What is P&L variance analysis?

P&L (profit & loss) variance analysis is the process of comparing actual financial results to expected results in order to identify differences or variances. This type of variance analysis is typically performed on a company’s income statement, which shows its revenues, expenses, and net profit or loss over a specific period of time. 

4) Why is the analysis of variance important?

The analysis of variance is important to track because it tells you about the financial health of your business. With proper variance analysis, you can measure the financial performance of your business, keep track of over- and under-performing financial metrics, and identify areas for improvement.


Variance Analysis


Standard costs provide information that is useful in performance evaluation. Standard costs are compared to actual costs, and mathematical deviations between the two are termed variances. Favorable variances result when actual costs are less than standard costs, and vice versa. The following illustration is intended to demonstrate the very basic relationship between actual cost and standard cost. AQ means the “actual quantity” of input used to produce the output. AP means the “actual price” of the input used to produce the output. SQ and SP refer to the “standard” quantity and price that was anticipated. Variance analysis can be conducted for material, labor, and overhead.

Variance Analysis Illustration

Direct Material Variances

Direct Material Variance Illustration

  • Materials Price Variance : A variance that reveals the difference between the standard price for materials purchased and the amount actually paid for those materials [(standard price – actual price) X actual quantity].
  • Materials Quantity Variance : A variance that compares the standard quantity of materials that should have been used to the actual quantity of materials used. The quantity variation is measured at the standard price per unit [(standard quantity – actual quantity) X standard price].

Note that there are several ways to perform the intrinsic variance calculations. One can compute the values for the red, blue, and green balls and note the differences. Or, one can perform the algebraic calculations for the price and quantity variances. Note that unfavorable variances (negative) offset favorable (positive) variances. A total variance could be zero, resulting from favorable pricing that was wiped out by waste. A good manager would want to take corrective action, but would be unaware of the problem based on an overall budget versus actual comparison.

Railing image

Blue Rail measures its output in “sections.” Each section consists of one post and four rails. The sections are 10’ in length and the posts average 4’ each. Some overage and waste are expected due to the need for an extra post at the end of a set of sections, faulty welds, and bad pipe cuts. The company has adopted an achievable standard of 1.25 pieces of raw pipe (50’) per section of rail. During August, Blue Rail produced 3,400 sections of railing. It was anticipated that pipe would cost $80 per 40’ piece. Standard material cost for this level of output is computed as follows:

Standard Material Cost

The production manager was disappointed to receive the monthly performance report revealing actual material cost of $369,000. A closer examination of the actual cost of materials follows.

Actual Material Cost

The total direct material variance was unfavorable $29,000 ($340,000 vs. $369,000). However, this unfavorable outcome was driven by higher prices for raw material, not waste, as follows:

MATERIALS PRICE VARIANCE

(sp – ap) x aq = ($80 – $90) x 4,100 = <$41,000>.

Materials usage was favorable since less material was used (4,100 pieces of pipe) than was standard (4,250 pieces of pipe). This resulted in a favorable materials quantity variance:

MATERIALS QUANTITY VARIANCE

(sq – aq) x sp = (4,250 – 4,100) x $80 = $12,000.
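
The arithmetic can be verified with a short Python sketch (a minimal illustration using the figures above; in this sign convention, positive means favorable and negative means unfavorable):

```python
# Blue Rail direct materials, August (positive = favorable).
sp, ap = 80, 90          # standard vs. actual price per 40' piece of pipe
sq, aq = 4_250, 4_100    # standard vs. actual pieces of pipe used

materials_price_variance    = (sp - ap) * aq   # -41,000 (unfavorable)
materials_quantity_variance = (sq - aq) * sp   # +12,000 (favorable)
total = materials_price_variance + materials_quantity_variance
print(total)             # -29,000, matching $340,000 standard vs. $369,000 actual
```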

Journal Entries

A company may desire to adapt its general ledger accounting system to capture and report variances. Do not lose sight of the very simple fact that the amount of money to account for is still the money that was actually spent ($369,000). To the extent the price paid for materials differs from standard, the variance is debited (unfavorable) or credited (favorable) to a Materials Price Variance account. This results in the Raw Materials Inventory account carrying only the standard price of materials, no matter the price paid:

Raw Materials Journal Entry

Work in Process is debited for the standard cost of the standard quantity that should be used for the productive output achieved, no matter how much is used. Any difference between standard and actual raw material usage is debited (unfavorable) or credited (favorable) to the Materials Quantity Variance account:

Work in Process Journal Entry

The price and quantity variances are generally reported by decreasing income (if unfavorable debits) or increasing income (if favorable credits), although other outcomes are possible. Examine the following diagram and notice the $369,000 of cost is ultimately attributed to work in process ($340,000 debit), materials price variance ($41,000 debit), and materials quantity variance ($12,000 credit). This illustration presumes that all raw materials purchased are put into production. If this were not the case, then the price variances would be based on the amount purchased while the quantity variances would be based on output.

Direct Materials Variance Illustration

Direct Labor Variances

Direct Labor Variance Illustration

In this illustration, AH is the actual hours worked, AR is the actual labor rate per hour, SR is the standard labor rate per hour, and SH is the standard hours for the output achieved.

The Total Direct Labor Variance consists of:

  • Labor Rate Variance : A variance that reveals the difference between the standard rate and actual rate for the actual labor hours worked [(standard rate – actual rate) X actual hours].
  • Labor Efficiency Variance : A variance that compares the standard hours of direct labor that should have been used to the actual hours worked. The efficiency variance is measured at the standard rate per hour [(standard hours – actual hours) X standard rate].

As with material variances, there are several ways to perform the intrinsic labor variance calculations. One can compute the values for the red, blue, and green balls. Or, one can perform the noted algebraic calculations for the rate and efficiency variances.

Recall that Blue Rail Manufacturing had to custom cut, weld, sand, and paint each section of railing. The company has adopted a standard of 3 labor hours for each section of rail. Skilled labor is anticipated to cost $18 per hour. During August, remember that Blue Rail produced 3,400 sections of railing. Therefore, the standard labor cost for August is calculated as:

Standard Labor Cost

The monthly performance report revealed actual labor cost of $175,000. A closer examination of the actual cost of labor revealed the following:

Actual Labor Cost

The total direct labor variance was favorable $8,600 ($183,600 vs. $175,000). However, detailed variance analysis is necessary to fully assess the nature of the labor variance. As will be shown, Blue Rail experienced a very favorable labor rate variance, but this was offset by significant unfavorable labor efficiency.

Direct Labor Variance Illustration

LABOR RATE VARIANCE

(sr – ar) x ah = ($18 – $14) x 12,500 = $50,000.

The hourly wage rate was lower because of a shortage of skilled welders. Less-experienced welders were paid less per hour, but they also worked slower. This inefficiency shows up in the unfavorable labor efficiency variance:

LABOR EFFICIENCY VARIANCE

(sh – ah) x sr = (10,200 – 12,500) x $18 = <$41,400>.
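
The same kind of quick check works for the labor variances (again, positive means favorable):

```python
# Blue Rail direct labor, August (positive = favorable).
sr, ar = 18, 14          # standard vs. actual rate per hour
sh, ah = 10_200, 12_500  # standard vs. actual hours worked

labor_rate_variance       = (sr - ar) * ah   # +50,000 (favorable)
labor_efficiency_variance = (sh - ah) * sr   # -41,400 (unfavorable)
print(labor_rate_variance + labor_efficiency_variance)
# 8,600 favorable, matching $183,600 standard vs. $175,000 actual
```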

If Blue Rail desires to capture labor variances in its general ledger accounting system, the entry might look something like this:

Labor Variance Journal Entry

Once again, debits reflect unfavorable variances, and vice versa. Such variance amounts are generally reported as decreases (unfavorable) or increases (favorable) in income, with the standard cost going to the Work in Process Inventory account.

The following diagram shows the impact within the general ledger accounts:

Direct Labor Variance T Accounts

Factory Overhead Variances

Variance analysis should also be performed to evaluate spending and utilization for factory overhead. Overhead variances are a bit more challenging to calculate and evaluate. As a result, the techniques for factory overhead evaluation vary considerably from company to company. To begin, recall that overhead has both variable and fixed components (unlike direct labor and direct material that are exclusively variable in nature). The variable components may consist of items like indirect material, indirect labor, and factory supplies. Fixed factory overhead might include rent, depreciation, insurance, maintenance, and so forth. Because variable and fixed costs behave in a completely different manner, it stands to reason that proper evaluation of variances between expected and actual overhead costs must take into account the intrinsic cost behavior. As a result, variance analysis for overhead is split between variances related to variable overhead and variances related to fixed overhead.

Variable Factory Overhead Variances

The cost behavior for variable factory overhead is not unlike direct material and direct labor, and the variance analysis is quite similar. The goal will be to account for the total “actual” variable overhead by applying: (1) the “standard” amount to work in process and (2) the “difference” to appropriate variance accounts.

Review the following graphic and notice that more is spent on actual variable factory overhead than is applied based on standard rates. This scenario produces unfavorable variances (also known as “underapplied overhead” since not all that is spent is applied to production). As monies are spent on overhead (wages, utilization of supplies, etc.), the cost (xx) is transferred to the Factory Overhead account. As production occurs, overhead is applied/transferred to Work in Process (yyy). When more is spent than applied, the balance (zz) is transferred to variance accounts representing the unfavorable outcome.

Unfavorable Variance Illustration

The next illustration is the opposite scenario. When less is spent than applied, the balance (zz) represents the favorable overall variances. Favorable overhead variances are also known as “overapplied overhead” since more cost is applied to production than was actually incurred.

Favorable Variances Illustration

A good manager will want to explore the nature of variances relating to variable overhead. It is not sufficient to simply conclude that more or less was spent than intended. As with direct material and direct labor, it is possible that the prices paid for underlying components deviated from expectations (a variable overhead spending variance). On the other hand, it is possible that the company’s productive efficiency drove the variances (a variable overhead efficiency variance). Thus, the Total Variable Overhead Variance can be divided into a Variable Overhead Spending Variance and a Variable Overhead Efficiency Variance .

Before looking closer at these variances, it is first necessary to recall that overhead is usually applied based on a predetermined rate, such as $X per direct labor hour. This means that the amount debited to work in process is driven by the overhead application approach. This will become clearer with the following illustration.

Blue Rail’s variable factory overhead for August consisted primarily of indirect materials (welding rods, grinding disks, paint, etc.), indirect labor (inspector time, shop foreman, etc.), and other items. Extensive budgeting and analysis had been performed, and it was estimated that variable factory overhead should be applied at $10 per direct labor hour. During August, $105,000 was actually spent on variable factory overhead items. The standard cost for August’s production was as follows:

Standard Production Cost

But, a closer look reveals that overhead spending was quite favorable, while overhead efficiency was not so good. Remember that 12,500 hours were actually worked.

Since variable overhead is consumed at the presumed rate of $10 per hour, this means that $125,000 of variable overhead (actual hours X standard rate) was attributable to the output achieved. Comparing this figure ($125,000) to the standard cost ($102,000) reveals an unfavorable variable overhead efficiency variance of $23,000. However, this inefficiency was significantly offset by the $20,000 favorable variable overhead spending variance ($105,000 vs. $125,000).
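
In Python, the variable overhead arithmetic looks like this (a minimal sketch of the figures just described; positive means favorable):

```python
# Blue Rail variable factory overhead, August (positive = favorable).
standard_rate  = 10       # $ per direct labor hour
standard_hours = 10_200   # hours allowed for 3,400 sections
actual_hours   = 12_500
actual_voh     = 105_000

applied_at_actual_hours = actual_hours * standard_rate           # 125,000
voh_spending_variance   = applied_at_actual_hours - actual_voh   # +20,000 (favorable)
voh_efficiency_variance = (standard_hours - actual_hours) * standard_rate  # -23,000
print(voh_spending_variance + voh_efficiency_variance)
# -3,000, matching $102,000 standard vs. $105,000 actually spent
```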

This entry applies variable factory overhead to production and records the related variances:

Variable Factory Overhead Journal Entry

The variable overhead efficiency variance can be confusing as it may reflect efficiencies or inefficiencies experienced with the base used to apply overhead. For Blue Rail, remember that the total number of hours was “high” because of inexperienced labor. These welders may have used more welding rods and had sloppier welds requiring more grinding. While the overall variance calculations provide signals about these issues, a manager would actually need to drill down into individual cost components to truly find areas for improvement.

Fixed Factory Overhead Variances

Actual fixed factory overhead may show little variation from budget. This results because of the intrinsic nature of a fixed cost. For instance, rent is usually subject to a lease agreement that is relatively certain. Depreciation on factory equipment can be calculated in advance. The costs of insurance policies are tied to a contract. Even though budget and actual numbers may differ little in the aggregate, the underlying fixed overhead variances are nevertheless worthy of close inspection.

Overhead Variance Illustration

As illustrated, $61,200 should be allocated to work in process. This reflects the standard cost allocation of fixed overhead (i.e., 10,200 hours should be used to produce 3,400 units). Notice that this differs from the budgeted fixed overhead by $10,800, representing an unfavorable Fixed Overhead Volume Variance .

Since production did not rise to the anticipated level of 4,000 units, much of the fixed cost (that was in place to support 4,000 units) was “under-utilized.” For Blue Rail, the volume variance is offset by the favorable Fixed Overhead Spending Variance of $2,000; $70,000 was spent versus the budgeted $72,000. Following is an illustration showing the flow of fixed costs into the Factory Overhead account, and on to Work in Process and the related variances.

Fixed Overhead Variance T Accounts
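
The fixed overhead variances can be checked the same way. The $72,000 budget is implied by the 4,000-unit plan at 3 hours and $6 per hour (a minimal sketch; positive means favorable):

```python
# Blue Rail fixed factory overhead, August (positive = favorable).
budgeted_foh = 4_000 * 3 * 6   # 72,000: planned 4,000 sections x 3 hrs x $6
actual_foh   = 70_000
applied_foh  = 10_200 * 6      # 61,200 applied for the 3,400 sections produced

foh_spending_variance = budgeted_foh - actual_foh    #  +2,000 (favorable)
foh_volume_variance   = applied_foh - budgeted_foh   # -10,800 (unfavorable)
```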

Following is the entry to apply fixed factory overhead to production and record related volume and spending variances:

Fixed Factory Overhead Journal Entry

The following spreadsheet summarizes the Blue Rail case study. Carefully trace amounts in the spreadsheet back to the illustrations.

Blue Rail Case Study Summary

Notice that the standard cost of $686,800 corresponds to the amounts assigned to work in process inventory via the various journal entries, while the total variances of $32,200 were charged/credited to specific variance accounts. By so doing, the full $719,000 actually spent is fully accounted for in the records of Blue Rail.
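
As a final tie-out, a short Python sketch sums the eight variances from this case study and confirms that the standard cost, adjusted for the net unfavorable variance, accounts for every dollar of the $719,000 actually spent (positive means favorable):

```python
# Blue Rail, August: all variances (positive = favorable).
variances = {
    "materials price":  -41_000, "materials quantity":  12_000,
    "labor rate":        50_000, "labor efficiency":   -41_400,
    "VOH spending":      20_000, "VOH efficiency":     -23_000,
    "FOH spending":       2_000, "FOH volume":         -10_800,
}
standard_to_wip = 340_000 + 183_600 + 102_000 + 61_200   # 686,800
actual_spent    = 369_000 + 175_000 + 105_000 + 70_000   # 719,000

net_variance = sum(variances.values())                   # -32,200 net unfavorable
assert standard_to_wip - net_variance == actual_spent    # every dollar accounted for
```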

Examining Variances

Not all variances need to be analyzed. One must consider the circumstances under which the variances resulted and the materiality of amounts involved. One should also understand that not all unfavorable variances are bad. For example, buying raw materials of superior quality (at higher than anticipated prices) may be offset by reduction in waste and spoilage. Likewise, favorable variances are not always good. Blue Rail’s very favorable labor rate variance resulted from using inexperienced, less expensive labor. Was this the reason for the unfavorable outcomes in efficiency and volume? Perhaps! The challenge for a good manager is to take the variance information, examine the root causes, and take necessary corrective measures to fine tune business operations.

In closing this discussion of standards and variances, be mindful that care should be taken in examining variances. If the original standards are not accurate and fair, the resulting variance signals will themselves prove quite misleading.

Did you learn?

  • What is a variance?
  • Be able to calculate and explain material, labor, and overhead variances.
  • When should variances be investigated?

Variance Analysis

Analysis of the difference between planned and actual numbers

What is Variance Analysis?

Variance analysis can be summarized as an analysis of the difference between planned and actual numbers. The sum of all variances gives a picture of the overall over-performance or under-performance for a particular reporting period. For each item, companies assess favorability by comparing actual costs to standard costs.

For example, if the actual cost is lower than the standard cost for raw materials, assuming the same volume of materials, it would lead to a favorable price variance (i.e., cost savings). However, if the standard quantity was 10,000 pieces of material and 15,000 pieces were required in production, this would be an unfavorable quantity variance because more materials were used than anticipated.


The Role of Variance Analysis

When standards are compared to actual performance numbers, the difference is what we call a “variance.” Variances are computed for both the price and quantity of materials, labor, and variable overhead and are reported to management. However, not all variances are important.

Management should only pay attention to those that are unusual or particularly significant. Often, by analyzing these variances, companies are able to use the information to identify a problem so that it can be fixed or simply to improve overall company performance.

Types of Variances

As mentioned above, materials, labor, and variable overhead consist of price and quantity/efficiency variances. Fixed overhead, however, includes a volume variance and a budget variance.

Materials Variance Analysis

The Column Method for Variance Analysis

The simplest way to calculate variances is to follow the column method and input all the relevant information. This method is best shown through the example below:

XYZ Company produces gadgets. Overhead is applied to products based on direct labor hours. The denominator level of activity is 4,030 hours. The company’s standard cost card is below:

Direct materials: 6 pieces per gadget at $0.50 per piece

Direct labor: 1.3 hours per gadget at $8 per hour

Variable manufacturing overhead: 1.3 hours per gadget at $4 per hour

Fixed manufacturing overhead: 1.3 hours per gadget at $6 per hour

In January, the company produced 3,000 gadgets. The fixed overhead expense budget was $24,180. Actual costs in January were as follows:

Direct materials: 25,000 pieces purchased at the cost of $0.48 per piece

Direct labor: 4,000 hours were worked at the cost of $36,000

Variable manufacturing overhead: Actual cost was $17,000

Fixed manufacturing overhead: Actual cost was $25,000

Materials Variance

Example of Materials Variance Analysis

Adding these two variances together, we get an overall variance of $3,000 (unfavorable). It is a variance that management should look at and seek to improve. Although the price variance is favorable, management may want to consider why the company needed more materials than the standard of 18,000 pieces. It may be due to the company acquiring defective materials or having problems or malfunctions with machinery.

Labor Variance

Example of Labor Variance Analysis

Adding the two variances together, we get an overall variance of $4,800 (unfavorable). This is another variance that management should look at. Management should address why the actual labor price is a dollar higher than the standard and why 100 more hours are required for production. The same column method can also be applied to variable overhead costs. It is similar to the labor format because the variable overhead is applied based on labor hours in this example.

Fixed Overhead Variance

Example of Fixed Overhead Variance Analysis

Adding the budget variance and volume variance, we get a total unfavorable variance of $1,600. Once again, this is something that management may want to look at.
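
For readers without access to the column-method illustrations, here is a minimal Python sketch that reproduces all three totals. It assumes, as the $3,000 materials total implies, that all 25,000 purchased pieces were used in production; negative values are unfavorable:

```python
# XYZ Company, January. Convention here: negative = unfavorable.

# Materials: standard is 3,000 gadgets x 6 pieces = 18,000 at $0.50;
# 25,000 pieces were bought at $0.48 and, we assume, all were used.
mat_price = (0.50 - 0.48) * 25_000        # +500   favorable
mat_qty   = (18_000 - 25_000) * 0.50      # -3,500 unfavorable
print(round(mat_price + mat_qty))         # -3,000 total, as in the text

# Labor: standard is 3,000 x 1.3 = 3,900 hours at $8; actual was
# 4,000 hours costing $36,000, i.e., $9 per hour.
lab_rate = (8 - 36_000 / 4_000) * 4_000   # -4,000 unfavorable
lab_eff  = (3_900 - 4_000) * 8            # -800   unfavorable
print(round(lab_rate + lab_eff))          # -4,800 total

# Fixed overhead: budget $24,180; applied at $6 x 3,900 standard hours,
# against a 4,030-hour denominator level of activity.
foh_budget = 24_180 - 25_000              # -820   unfavorable
foh_volume = (3_900 - 4_030) * 6          # -780   unfavorable
print(foh_budget + foh_volume)            # -1,600 total
```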


The Role of Standards in Variance Analysis

In cost accounting, a standard is a benchmark or a “norm” used in measuring performance. In many organizations, standards are set for both the cost and quantity of materials, labor, and overhead needed to produce goods or provide services.

Quantity standards indicate how much labor (i.e., in hours) or material (i.e., in kilograms) should be used in manufacturing a unit of product, while cost standards indicate what the cost of a labor hour or a unit of material should be. Standards, in essence, are the prices and quantities a company expects to incur.



Variance Analysis Impacting Company Financials


Variance analysis stands as a cornerstone in financial management, providing key insights into operational performance and financial health. This analysis aids the Office of Finance in pinpointing the reasons behind the deviations of actual results from budgeted figures. Through a meticulous breakdown of price, volume, and mix variances, companies can fine-tune their strategies, enhance efficiency, and optimize profitability.

What is variance analysis?

At its core, variance analysis involves the process of segmenting the difference between planned financial outcomes and actual financial performance into distinct components. This allows for precise identification of the sources of variance, which can be categorized as favorable or unfavorable. Understanding these distinctions is critical for strategic decision-making and effective financial management.

The significance of price variance

Price variance reflects the impact of the difference between actual and expected costs on the company's finances. In the context of sales, it would pertain to the variance caused by selling goods at a price different from the planned price. For expenses, it relates to purchasing materials or services at costs diverging from those budgeted. Analyzing price variances helps organizations adjust their pricing strategies and manage procurement practices more effectively.

Consider a manufacturing company that produces electronic goods. The budgeted cost of copper, a key material, was set at $5 per pound with an expected need of 10,000 pounds per month. However, due to fluctuations in the commodities market, the actual price paid was $5.50 per pound. The price variance, in this case, is unfavorable as the company ended up spending $0.50 more per pound than planned, totaling an additional $5,000 per month. This type of analysis prompts the company to either renegotiate supplier contracts or find alternative materials to mitigate such cost overruns in the future.
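
The copper example reduces to a single line of arithmetic, sketched here in Python with the stated figures:

```python
budgeted_price, actual_price = 5.00, 5.50   # $ per pound of copper
monthly_pounds = 10_000                     # expected monthly usage

price_variance = (actual_price - budgeted_price) * monthly_pounds
print(price_variance)   # 5000.0: an extra $5,000 per month (unfavorable)
```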

Volume variance and its implications

Volume variance is crucial in understanding how the quantity of goods sold or produced affects financial outcomes. This analysis helps ascertain whether financial performance variations are due to selling more or less than anticipated or producing at levels different from the plan. Insights derived from volume variance analysis enable management to align production schedules and sales strategies with market demands and operational capacities.

For example, a company specializing in home appliances planned to sell 50,000 units of a newly launched mixer-grinder but managed to sell only 40,000 units due to a delay in seasonal sales promotions. This resulted in an unfavorable volume variance, leading to lower-than-expected revenue. Learning from this, the company can better align promotional activities with peak buying times to ensure target volumes are met.

Navigating through mix variances

Mix variances occur when the proportion of products sold or resources used differs from the expected mix. This type of variance is particularly important in companies with diverse product lines or multiple cost centers. By analyzing mix variances, financial leaders can uncover inefficiencies in product mix strategies and resource allocation, leading to more informed strategic planning.

In the construction industry, a project budget might allocate costs based on a planned mix of labor and materials, such as higher proportions of skilled labor versus unskilled labor. If a project ends up using more unskilled labor due to the unavailability of skilled laborers, it could lead to a mix variance. This could affect the project’s profitability if unskilled labor is less efficient, requiring more hours to complete the same tasks. This insight would be crucial for future labor planning and contract negotiations.

Harnessing favorable and unfavorable variances

Identifying whether variances are favorable or unfavorable is essential for effective financial stewardship. Favorable variances indicate better-than-expected performance, providing opportunities to capitalize on successful strategies. Conversely, unfavorable variances signal areas needing improvement, prompting immediate attention and corrective actions.

Managing favorable variance: Manufacturing sector

A chemical manufacturer might project the cost of raw materials based on historical prices and consumption rates. Suppose the company manages to negotiate a better deal with suppliers, resulting in a lower cost than budgeted while maintaining the same quality and quantity. This favorable price variance can provide additional budgetary leeway, allowing the company to allocate funds to other areas such as R&D or marketing to strengthen its market position.

Addressing unfavorable variance: Healthcare sector

A hospital anticipated certain operational costs for a fiscal year based on average prices and usage rates of medical supplies. However, a sudden increase in patient admissions due to a local health crisis caused an unfavorable variance as the use of medical supplies exceeded forecasts. This scenario would highlight the need for more flexible procurement strategies to quickly adjust to unexpected increases in demand without significantly impacting operational budgets.

Strategies for managing variance

Effective management of variances involves several strategic actions:

  • Continuous monitoring: Regular analysis of variances helps in maintaining control over financial performance and in initiating timely adjustments.
  • Integrative communication: Facilitating open communication channels across departments ensures that variance insights are integrated into operational planning and decision-making.
  • Adaptive planning: Flexibility in financial planning enables companies to adjust their strategies based on variance analysis findings, enhancing responsiveness to market changes.

Variance analysis is not just about numbers; it's a strategic tool that, when used wisely, can significantly influence a company's financial trajectory. For CFOs and the Office of Finance, mastering variance analysis is essential for fostering robust financial health and steering their companies toward sustained profitability.

By understanding the nuances of price, volume, and mix variances, and effectively managing these variances, financial leaders can ensure that their organizations remain competitive, adaptive, and financially stable in a dynamic business environment. Integrating a sophisticated financial management tool like Centage can be a game-changer in this process. 

To see firsthand how Centage can transform your financial operations and variance analysis, book a demo today and take the first step toward optimizing your financial performance and strategic decision-making.


A Case Study of a Variance Analysis Framework for Managing Distribution Costs

Kevin C. Gaffney, V. Gladkikh, and R. Webb. Accounting Perspectives, 1 May 2007. DOI: 10.1506/8172-1165-6601-3L37



Analysis of Variance

Klaus Backhaus, Bernd Erichson, Sonja Gensler, Rolf Weiber, and Thomas Weiber. First online: 29 June 2023.
Analysis of variance is a procedure that examines the effect of one (or more) independent variable(s) on one (or more) dependent variable(s). For the independent variables, which are also called factors or treatments, only a nominal scaling is required, while the dependent variable (also called target variable) is scaled metrically. The analysis of variance is the most important multivariate method for the detection of mean differences across more than two groups and is thus particularly useful for the evaluation of experiments. The chapter deals with both the one-factorial (one dependent and one independent variable) and the two-factorial (one dependent and two independent variables) analysis of variance and extends the considerations in the case study to the analysis with two (nominally scaled) independent factors and two (metrically scaled) covariates. Furthermore, contrast analysis and post-hoc testing are also covered.
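As a minimal illustration of the one-factorial case described above (one metrically scaled dependent variable, one nominally scaled factor), here is a sketch using SciPy's f_oneway; the data are invented, with three groups of five observations echoing the chapter's small didactic samples.

```python
from scipy import stats

# Three factor levels, five metric observations each (invented data,
# not the chapter's case-study data).
group_a = [21, 24, 23, 25, 22]
group_b = [28, 30, 29, 27, 31]
group_c = [18, 20, 19, 21, 17]

f_emp, p_value = stats.f_oneway(group_a, group_b, group_c)
print(f"F = {f_emp:.2f}, p = {p_value:.6f}")
# Reject the null hypothesis of equal group means if p < alpha (e.g., 0.05).
```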


This is called inferential statistics and has to be distinguished from descriptive statistics. Inferential statistics makes inferences and predictions about a population based on a sample drawn from the studied population.

In our example, only 5 observations per group and thus a total of 15 observations were chosen in order to make the subsequent calculations easier to understand. The literature usually recommends a minimum of 20 observations per group.

On the website www.multivariate-methods.info we provide supplementary material (e.g., Excel files) to deepen the reader's understanding of the methodology.

For a brief summary of the basics of statistical testing see Sect. 1.3.

For a detailed explanation of degrees of freedom (df) see Sect. 1.2.1.

The user can also choose other values for α. However, α = 5% is a kind of "gold standard" in statistics and goes back to R. A. Fisher (1890–1962), who developed the F-distribution. The user must also consider the consequences (costs) of a wrong decision when making a decision.

The p-value can also be calculated with Excel by using the function F.DIST.RT(F_emp; df1; df2). For our application example, we get: F.DIST.RT(38.09; 2; 12) = 0.0000064, or 0.00064%. The reader will also find a detailed explanation of the p-value in Sect. 1.3.1.2.
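For readers working outside Excel, the same right-tail F probability can be computed with SciPy's survival function; this equivalent is added here for convenience and is not part of the chapter.

```python
from scipy import stats

# Right-tail F probability, equivalent to Excel's F.DIST.RT(38.09; 2; 12).
p_value = stats.f.sf(38.09, 2, 12)  # arguments: F statistic, df1, df2
print(f"p = {p_value:.7f}")  # approximately 0.0000064
```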

Guidance on testing the assumption of multivariate normal distribution is given in Sect. 3.5. A detailed description of the testing of variance homogeneity using the Levene test is given in Sect. 3.4.3.

The alpha error reflects the probability of rejecting the null hypothesis although it is true. For type I and type II errors, refer to the basics of statistical testing in Sect. 1.3.

SPSS offers a total of 18 variants of post-hoc tests. See Fig. 3.16 in Sect. 3.3.3.2.

Please note that the number of 5 observations per group, and thus a total of 30 observations, was chosen to make the subsequent calculations easier to understand. In the literature, at least 20 observations per group are usually recommended for a two-way ANOVA.

Here, the different types of interaction are illustrated graphically. The interaction effects in the application example correspond to those in the case study and are shown and explained in Fig. 3.15.

For didactic reasons, the data of the extended example are also used in the case study (cf. Sect. 3.2.2.1; Table 3.9). Note that the case study is thus based on a total of 30 cases only. In the literature, at least 20 observations per group are usually recommended.

For the types of interaction effects and the calculation of the interaction effect in the case study, see the explanations in Sect. 3.2.2.1.

The p-value can also be calculated using Excel's function F.DIST.RT(F_emp; df1; df2). For the example in Sect. 3.2.1.1, we obtain: F.DIST.RT(0.062; 2; 12) = 0.9402. A detailed explanation of the p-value may be found in Sect. 1.3.1.2.



About this chapter

Backhaus, K., Erichson, B., Gensler, S., Weiber, R., Weiber, T. (2023). Analysis of Variance. In: Multivariate Analysis. Springer Gabler, Wiesbaden. https://doi.org/10.1007/978-3-658-40411-6_3



Case Study on Variance Analysis


It is important to quantify the total difference between planned and actual figures in order to plan the company's further development and refine its strategies in light of these variances in expenditure. Variance analysis is particularly important for small, growing firms, because every sum recovered is a chance for further improvement.

Variance analysis is a useful practice that can make business development faster and more effective: if a company produces similar goods, it can project the variance across all of its production and put the surplus to effective use. The topic of variance analysis is interesting and important for every future accountant. A student asked to complete a variance analysis case study will have to read widely on the topic to build knowledge of the problem under research. There are many reliable sources that can help in the writing process and make a student aware of the aspects and principles of variance analysis. A student is supposed to investigate the topic professionally, collect the appropriate data for the research, and draw wise conclusions in the end. One should research the case site, discover the nature of the problem, and find out about its cause and effect.


A student is expected to brainstorm effective solutions to the suggested problem on variance analysis to demonstrate professional skill. Case study writing is treated as a complicated process, because this paper type has its own peculiarities and rules that must be followed. With professional writing assistance available on the internet, it is possible to prepare a good paper yourself by taking advantage of a free example case study on variance analysis, such as a well-organized sample case study on standard costing and variance analysis prepared by an expert on the web.



Cost Variance in Project Management: How to Calculate It

Lucija Bakić

August 28, 2024

Cost variance in project management helps project managers keep their cost baseline under control.

This is why cost variance analysis is one of the core project management metrics. In this article, we’ll discuss how you can calculate your cost variance, what you can use it for, the different types of project costs, and how you can minimize overruns.

What Is Cost Variance in Project Management?

Cost variance is a metric that depicts the difference between your expected project cost and your actual cost at a certain point in time. A positive cost variance means that you’re under budget, while a negative variance signifies a budget overrun.

How to Calculate Cost Variance?

The basic formula for cost variance is:

projected cost – actual cost = cost variance

Projected cost is also sometimes called earned value (EV), since it reflects the budgeted value of the work completed so far. To give a practical example, say your original budget is $10,000 and you’ve completed 50% of the project; your projected cost, or earned value, is $5,000. If you’ve actually spent only $4,000, your cost variance is $1,000, meaning you’re not only within budget but have spent less than expected. For the cost variance percentage, use the following formula:

cost variance / projected cost x 100 = cost variance percentage

Using the example above, $1,000 / $5,000 is 20%, meaning you’ve spent 20% less than expected.
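Both formulas are easy to script. Here is a minimal Python sketch using the numbers from the example above; the function names are illustrative.

```python
def cost_variance(earned_value, actual_cost):
    """Positive = under budget, negative = over budget."""
    return earned_value - actual_cost

def cost_variance_pct(earned_value, actual_cost):
    """Variance as a share of the value of work completed."""
    return cost_variance(earned_value, actual_cost) / earned_value * 100

# $10,000 budget, 50% complete -> earned value $5,000; actual spend $4,000.
ev, ac = 0.5 * 10_000, 4_000
print(cost_variance(ev, ac))      # 1000.0 -> $1,000 under budget
print(cost_variance_pct(ev, ac))  # 20.0   -> spent 20% less than planned
```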

Automate your project performance insights with Productive

If you don’t want to do manual calculations, you can use project management software like Productive for real-time data access. Head over to our section on tracking cost variance with Productive to learn more.

Types of Cost Variance Formulas

There are also a couple of different ways to measure your budget spend and get insights from multiple perspectives, including:

  • Point-in-time cost variance: compares cost variance within a chosen period of time
  • Cumulative cost variance: considers variance from the start of the project up to a chosen point in time
  • Variance at completion: considers variance from the start of the project up to its completion

A full understanding of cost variance requires using all three methods. Point-in-time and cumulative cost variance help project managers take proactive measures to correct unfavorable cost variance, while variance at completion is necessary for delivering budget reports and promoting cost-efficient workflows.
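A short sketch of how the three views relate, assuming you track earned value and actual cost per period; all figures are invented.

```python
periods = [  # (earned value, actual cost) per month -- invented figures
    (2_000, 1_800),
    (3_000, 3_500),
    (2_500, 2_400),
]

# Point-in-time: one variance per period.
point_in_time = [ev - ac for ev, ac in periods]  # [200, -500, 100]

# Cumulative: variance from project start up to each period.
cumulative, running = [], 0
for ev, ac in periods:
    running += ev - ac
    cumulative.append(running)                   # [200, -300, -200]

# Variance at completion: the final cumulative figure once the project ends.
variance_at_completion = cumulative[-1]          # -200 -> over budget overall
```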

The Importance of Cost Variance Analysis

Cost variance analysis helps project managers identify exactly where budgets begin to diverge from expectations. This allows them to pinpoint specific roadblocks and address them before the project gets derailed, making cost variance an essential part of project financial management. Additionally, building a better understanding of how to control actual cost supports future projects: it helps set better estimates and measures for monitoring your budgeting in stages. Cost variance also supports accountability and transparency between project stakeholders. Project managers who have a handle on their finances across time periods can present this data to clients, which helps manage expectations, control scope creep, and increase client satisfaction overall.

We also love the ability to invite our clients into the projects. It takes the middle man out of the equation, no need to go back and forth via e-mail; we can get all of the feedback within Productive. This also lets the client see how much work is actually going into the project, and you can see that they have a greater appreciation for what we do.

Alex Streltsov , General Manager at Prolex Media

Learn how to optimize your project cost control and collaboration with Productive.

Types of Project Costs

There are two main types of project costs: direct and indirect. Direct costs are closely tied to the production of goods and services; in professional services, this usually means employee salaries. Indirect costs, also known as overhead, are tied to activities that don’t produce revenue. Overhead can itself be classified into two types: 1. Fixed overhead, which includes costs that remain constant and don’t change with business activity levels. For example:

  • Facility costs (office space rent, equipment, etc.)
  • Salaries of non-billable employees and professional services (sales, HR, office management, etc.)
  • Licenses and plans (web hosting, software licenses, etc.)

2. Variable overhead costs , or expenses which fluctuate depending on the volume of goods used or services provided, and include:

  • Salary or benefit components (employee vacations, bonuses, etc.)
  • Equipment maintenance (building repairs, vehicle repairs, etc.)
  • Marketing (advertising or PR investments, etc.)

Project managers will need to understand and consider all of these cost elements to optimize their management.

Tips for Efficient Cost Management

Now that you know what you need to calculate and how to do it, here are three tips to prevent and mitigate cost overrun in project management :

Realistic Estimates

Without realistic estimates, it’s impossible to prevent cost overruns. Businesses might commit to delivering more than they’re capable of for various reasons: fear of losing a client, inexperience, and lack of historical data. While the first reason requires a mindset change, the other two can be addressed more easily. Businesses should consider implementing a more thorough project planning and risk management process to improve estimation. Implementing project management tools that gather past project data can help pinpoint historical roadblocks and address them so they don’t impact future projects. Regularly revisiting and updating these estimates throughout the project lifecycle also allows businesses to adapt to changes promptly, another task for which software is invaluable.

Understanding Profitability

Even if your costs have exceeded their initial estimates, that doesn’t necessarily mean that either the project or your business operations will be negatively impacted. You might still be able to retain a positive profit margin . This is why it’s so important to have budgeting and profitability information in one place; in these cases, going over budget can be a good idea in order to satisfy client requests and deliver a better service.

With Productive, I’m understanding new things about profitability. I’ve made certain assumptions before, and some of those assumptions have proven to be wrong. For some projects, we weren’t sure how far over budget we were, and now we can really see.

Roberto Ciarleglio, Co-founder and Managing Director of Contra Agency

Employee Utilization

Finally, efficient resource allocation ties into both your estimating and profitability. This involves assigning the right people to the right tasks and ensuring that their skills are fully utilized without overburdening them. This is done by calculating employee utilization, or the percentage of billable hours worked vs total hours worked. Low utilization means that project teams are missing out on billable work and, consequently, revenue. It can signify issues with your workflows, such as miscommunication, lack of organization on tasks, or client issues. On the other hand, if your utilization is too high, your project team may be on track to burnout. Unexpected employee attrition can have a significant impact on your project success, not to mention the costs of finding and onboarding new staff. This is why utilization and capacity planning are so important to effective cost management.
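A minimal sketch of the utilization calculation described above; the 60% and 90% thresholds are illustrative guardrails, not industry benchmarks.

```python
def utilization(billable_hours, total_hours):
    """Billable share of total worked hours, as a percentage."""
    return billable_hours / total_hours * 100

rate = utilization(120, 160)  # e.g., 120 billable out of 160 worked hours
print(f"Utilization: {rate:.0f}%")  # -> 75%

# Illustrative guardrails only -- not industry benchmarks.
if rate < 60:
    print("Low utilization: check workflows, pipeline, or task organization")
elif rate > 90:
    print("High utilization: watch for overload and burnout risk")
```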

Balance your team workloads with Productive

Manage Your Cost Variance With Productive

With Productive, you don’t have to manually gather your estimated vs actual cost to monitor your budget burn. Instead, Productive takes into account your employee billable rates and logged hours to update your budgets in real time. In the example below, you’ll see that 5 billable hours tracked x billable rate of $100/h = $500 budget used. You can also set automatic warnings on your budget; for example, when 50% of billable hours have been tracked. You’ll get an email notification letting you know that it’s time to check your estimated vs actual costs.

Manage your Budgeting across individual clients

To monitor the overall trend of your budget over the project duration, you can use the cumulative view. For a point-in-time analysis, switch to the per-period view and get a snapshot of a selected period of time.

You can also filter your data by year, quarter, month, or week.

To see whether you’re on top or over your budget, you can check the Budget tab under the chart. With a negative cost variance, your Budget remaining will be under 0% (and have a red line to indicate overflow), while a positive cost variance will be over 0% (with a green line indicating how much you have left to spend). To make your data more accurate, you can also enable overhead costs and expenses. Productive calculates your true profit per hour by deducting employee cost rates (salaries) and overhead cost per hour (facility cost + internal cost such as time off) from billable rates.
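Based on that description, true profit per hour works out roughly as follows; this is a sketch of the stated formula with invented figures, not Productive's actual implementation.

```python
def true_profit_per_hour(billable_rate, cost_rate, overhead_per_hour):
    """Profit left per billable hour after salary and overhead costs."""
    return billable_rate - cost_rate - overhead_per_hour

# $100/h billable rate, $45/h salary cost, $15/h overhead (all assumed).
print(true_profit_per_hour(100.0, 45.0, 15.0))  # -> 40.0, i.e., $40/hour
```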

Productive’s overhead calculator generates overhead cost per hour

With expenses, you can manage various additional costs and even set up an approval process to verify them beforehand.

Manage all of your expenses and payments

Going back to our Budgeting overlay, you can also switch to the Profitability overlay to monitor profit margins and even forecast these metrics. How does this work? By scheduling your project team ahead of time with Productive’s Resource Planning, you can get your estimate at completion (a forecast of the total cost of the project once it’s finished) to take corrective actions on time and improve financial performance.

Manage your project schedule and project scope

Productive’s additional features include:

  • Various project views, including Gantt, Workload, and Kanban
  • Task management and time tracking
  • Agency rate card management, proposals, and invoicing
  • Employee utilization and availability management
  • No-code automations and custom permissions
  • Report creation and sharing

You can check out Productive with a 14-day free trial.


Other Types of Variance Analysis Metrics

Variance analysis can be applied to other metrics, including:

  • Schedule variance: depicts the difference between expected vs actual project timelines.
  • Labor rate: depicts the difference between the expected vs actual cost of labor.
  • Materials: depicts the difference between the expected vs actual cost of materials.
  • Sales: depicts the difference between the expected vs actual selling price for goods or services.
  • Overhead: depicts the difference between the expected vs actual overhead costs.

Conclusion: Managing Your Budget and Variances

In short, a project manager needs to understand and manage cost variance regularly to get control over the project’s financial health and ensure a positive outcome. By calculating different variances—point-in-time, cumulative cost variance, and variance at completion—you can take proactive measures to prevent budget overruns and ensure profitability. Tools like Productive simplify this process by providing real-time insights into your project cost management, from budgeting to profitability analysis . By leveraging Productive’s comprehensive features, including resource planning and cost tracking, you can enhance your decision-making, optimize resource utilization, and ultimately deliver successful projects within budget. Book a demo with Productive to start optimizing your cost performance today.



Examining Cost Measurements in Production and Delivery of Three Case Studies Using E-Learning for Applied Health Sciences: Cross-Case Synthesis

Edward Meinert 1, 2, Abrar Alturkistani, Kimberley A. Foley, David Brindley

1 Department of Primary Care and Public Health, Digital Global Health Unit, Imperial College London, London, United Kingdom

2 Department of Paediatrics, Healthcare Translation Research Group, University of Oxford, Oxford, United Kingdom

Associated Data

  • Case study protocol – Educating administrative staff to engage with young patients
  • Case study protocol – The impact of climate change on public health
  • Case study protocol – Data science in healthcare using real world evidence
  • Ingredient costs variance calculation

The World Health Report (2006) by the World Health Organization conveys that a significant increase is needed in global health care resourcing to meet the current and future demand for health professionals. Electronic learning (e-Learning) presents a possible opportunity to change and optimize training by providing a scalable means for instruction, thus reducing the costs for training health professionals and providing patient education. Research literature often suggests that a benefit of e-Learning is its cost-effectiveness compared with face-to-face instruction, yet there is limited evidence with respect to the comparison of design and production costs with other forms of instruction or the establishment of standards pertaining to budgeting for these costs.

To determine the potential cost favorability of e-Learning in contrast to other forms of learning, there must first be an understanding of the components and elements for building an e-Learning course. Without first taking this step, studies lack the essential financial accounting rigor for course planning and have an inconsistent basis for comparison. This study aimed to (1) establish standard ingredients for the cost of e-Learning course production and (2) determine the effect instructional design has on the production costs of e-Learning courses.

This study made use of a cross-case method across 3 case studies using mixed methods, including horizontal budget variance calculation and qualitative interpretation of course designers' explanations of budget variance using total quality management themes. The different implementation-specific aspects of these cases were used to establish common principles in the composition of budgets in the production and delivery of an applied health professional e-Learning course.

A total of 2 case studies reported significant negative budget variances caused by issues surrounding underreporting of personnel costs, inaccurate resource task estimation, lack of contingency planning, challenges in third-party resource management, and the need to update health-related materials that became outdated during course production. The third study reported a positive budget variance because of the cost efficiency derived from previous implementation, the strong working relationship of the course project team, and the use of iterative project management methods.

Conclusions

This research suggests that the delivery costs of an e-Learning course could be underestimated or underreported and identifies factors that could be used to better control budgets. Through consistent management of factors affecting the cost of course production, further research could be undertaken using standard economic evaluation methods to evaluate the advantages of using e-Learning.

Introduction

The World Health Report (2006) by the World Health Organization (WHO) [ 1 ] conveys that a significant increase is needed in global health care resourcing to meet the current and future demand for health professionals. Current challenges to health care resourcing include the increasing demand resulting from the aging population’s need for chronic disease management, in addition to the growing population placing an increased demand on primary care [ 2 ]. This increased demand on resources requires a scalable means to train resources; opportunities to optimize training through alternatives to face-to-face instruction present the possibility of increasing the pace and breadth of education to health care resourcing. A 2015 WHO systematic review of e-Learning for undergraduate health professional education concluded that “computer-based and Web-based e-Learning is no better and no worse than face-to-face learning with regards to knowledge and skill acquisition” [ 3 ]. e-Learning is defined as “an approach to teaching and learning, representing all or part of the educational model applied, that is based on the use of electronic media and devices as tools for improving access to training, communication, and interaction and that facilitates the adoption of new ways of understanding and developing learning” [ 4 ]. It presents a possible opportunity to change and optimize training in health professions (including clinical, allied, and applied health sciences, as well as patient education) by providing a scalable means for instruction, thus reducing the costs necessary in delivery and implementation. If we accept that pedagogically e-Learning can result in a positive educational effect used under optimal circumstances, which is still subject to ongoing investigation, there remains the possibility that deployment of e-Learning could affect the scale, cost, and reach of health professions education.

Research Problem

One of the motivations for implementing e-Learning is the potential long-term efficiency gain in its delivery model [ 5 , 6 ]. A course delivered digitally versus the cost of a lecturer providing face-to-face instruction appears to have long-term cost favorability [ 7 ]. The literature often suggests that a benefit of Web-based learning is its cost-effectiveness compared with face-to-face instruction [ 8 ]; however, there is limited evidence validating comparison with other forms of instruction or standards for the budgeting of the costs in the production and execution of e-Learning courses. In the case of massive open online courses (MOOCs), there is limited evidence on the costs associated with their production [ 9 ]. In addition, the costs to develop an e-Learning course are significant when executed to a high standard. Although there are studies that capture data relating to factors associated with educational costs, measurements in these studies are collected inconsistently and include a wide variety of factors [ 3 , 10 ]. There is limited transparency in costing models because of sensitivity on where direct costs should be applied [ 11 ]. A systematic means is required to comprehensively record costs that can then subsequently enable testing of whether the e-Learning course has desirable economic properties and under what scenarios [ 12 ]. If proven so, this could assist in addressing the high cost of delivering health professions education. By contrast, should evidence point the other way, having discrete data points will allow those involved in online health education to identify ways to optimize costs in delivery. The primary issue here is identification of the direct and indirect costs in implementation, which then allows the execution of further economic evaluation.

Aims and Objectives

The aim of this study was to establish an approach for identifying costs in the design, development, and deployment of applied health sciences (defined as applied health subjects) e-Learning courses and to subsequently propose a budgeting framework for the planning and management of e-Learning course implementations. The costs in this study include the direct and indirect costs from inception through course delivery. This approach will allow course designers and implementers to leverage knowledge gained from the study's e-Learning case studies across different implementation contexts to better plan and manage future implementations, and it creates a reusable framework for cost planning. This work will demonstrate the effect of pre-implementation budget management against the proposed framework and should result in better course planning.

The study’s objectives are as follows:

  • Establish an approach to capture standard components or ingredients for the cost of the production of an e-Learning course.
  • Determine the effect that instructional design has on the production costs of e-Learning courses.

The study’s aims and objectives intend to address a gap in the research literature concerning implementation details on planning and executing e-Learning in health professions education [ 8 ]. In addition to limited cost-centered studies on e-Learning for health professions education, there are limited details on how course designers and producers calculate the associated costs of producing these course types. Developing models will allow for the adoption of data sharing and course planning for improved management in the execution of this course method and for further refinement and analysis. To explore this issue, this research examines the following 3 distinct e-Learning implementations as case studies.

Educating Administrative Staff to Engage With Young Patients

The course was created as a small private online course (SPOC) to prepare general practice administrative staff for issues in the management of adolescents. The course used case studies to provide training to help general practice staff feel confident in helping adolescents with a goal of improving the patient experience.

The Impact of Climate Change on Public Health

This course was created as a MOOC to educate citizens on the relationship between climate change and public health by using a multidisciplinary academic framework in data science to analyze, interpret, and present evidence. Core case studies focused on climate change and its health economic effect on local, regional, and national health systems.

Data Science in Health Care Using Real-World Evidence

This course was created as a blended MOOC to make learners aware of the effect data science can have on medicine and inspire the application of these methods across various undergraduate curriculum disciplines, the UK National Health Service commissioning support organizations, health care regulation organizations, and life sciences industries (ie, pharmaceuticals, biotechnology, and medical devices). The implementation of the blended MOOC was executed as a face-to-face course for learners; learners first took part in the MOOC and were then offered a residential course examining case studies. The target audience of the MOOC was allied health professionals or citizens looking to transition or enhance skills in data science in health care–related industries such as the pharmaceutical industry or biotech organizations. One of the key objectives of the course was to establish a global network of people to continue and advance the dialogue on data science in health care. Some of the course outcomes include the use and application of real-world evidence data collection and analysis techniques in health care settings.

Methods

A mixed-methods case study design was selected to support a systematic means of observing the subject of investigation [ 13 ] and the ability to combine quantitative and qualitative approaches [ 14 ]. Mixed-methods research presents an opportunity to combine the strengths of quantitative and qualitative research to counteract the limitations inherent when each method is used in isolation [ 14 ]. In this study, for example, the limitations of quantitatively isolating cost differences in the 3 cases are strengthened by the repeatable and generalizable nature of the qualitative approach used to interpret results. Case studies were selected based on their relevance to the study inquiry and the ability to capture, record, and analyze data from each case. Each study was structured through a study protocol to govern the case execution.

Case Study Overview

The objective of the case study is to inform the way future costs are budgeted in the development of e-Learning courses. The research forms part of a broader investigation into the costs associated with e-Learning course production; the main focus of each case was to collect primary evidence in the construction of these costs to allow for further research comparing results with other Web-based learning implementation types.

  • Study question: how are the total costs for the production and delivery of an e-Learning course (dependent on type) calculated?
  • Proposition: actual and budgeted costs will vary in the production or delivery of this course type.

Existing research literature indicates challenges in the capture of total costs for the production of Web-based learning despite standard methods for cost calculation [ 8 ]. The reason for this variance is likely that the skills required to create instructional learning designs and to capture costs differ, and educators are not trained in cost accounting methods.

The analytical framework for this investigation is based on the cost analysis methods underpinning education economic evaluation developed by Levin [ 15 ], which extends the standard costing and variance calculation principles of activity-based costing [ 16 - 18 ]. The ingredients method [ 15 ] is used to capture total cost production against cost categories. It examines the core composition of costs in the delivery of an education intervention; this is an activity-based costing approach that seeks to understand the core components required for delivery. Defining core costs is critical to performing further economic evaluations, though it is important to note that the scope of this research is limited to cost identification and not further economic analysis (eg, cost-benefit analysis, cost-effectiveness analysis, cost-utility analysis, and cost-feasibility analysis).

Case study protocols (Multimedia Appendices 1, 2, and 3) were developed at study commencement to demonstrate the way costs would be captured and analyzed. These protocols, in addition to a protocol for qualitative and quantitative analysis of learning effect (which is outside the scope of the cost investigation) [ 19 ], were drafted, submitted, peer reviewed, and approved.

Data Collection Procedures

Evidence to Be Expected

To validate the costs reported in the actual budget (which was an actual cost report), at least 2 separate sources confirming the final reported amount were sought (eg, for a reported incurred cost for staff, timesheets were reviewed to match hours to costs, task completion, and assignment in a project plan). These data comparisons increased the likelihood that reported data were accurate.

Events to Be Observed

Although the course implementation was observed and additional studies were completed investigating the educational effect, the scope of this study centered on cost decision making and the way production affected delivery costs. Therefore, observation for this study focused on reported costs and the way they correlated with actual time data.

Documentation to Be Reviewed

The project budget, actual costs, and timesheets were reviewed for this study. Although there will be a review of the completed course and observation of the way the course uptake is completed, the latter shall be excluded from this study. A traceability log was maintained in Microsoft Excel linking the research questions to data sources and the study findings.

Protocol Questions

Study question: how are the total costs for the production and delivery of an e-Learning (type dependent on implementation type) course calculated?

  • The costs will be measured and ingredients captured and analyzed to understand the factors affecting course production.
  • Data will be collected to support the cost analysis categories.
  • The corresponding evidence will be used to summarize ways that cost capture practices could be improved.

Study Framework

Each case study followed a 6-stage process in the investigation ( Table 1 ) [ 13 ]. The research question centered on identifying the total costs of production and delivery in these e-Learning implementations, and the effect of factors on variance from anticipated budgets. It was selected because evidence from the literature suggests inconsistency in the determination of costs for the delivery of Web-based courses [ 20 ]. This is significant because the lack of consistent cost capture mechanisms for Web-based learning compromises any further evaluation. Despite available methods to avoid this outcome, the literature presents research with claims that Web-based learning is more cost effective than face-to-face learning. This research provides a structured means to generate evidence to subsequently evaluate such claims by collecting baseline data on course production for further evaluation.

Table 1. Case study framework.

| Stage | Outcome |
| --- | --- |
| Plan | Case description and linking of case approach to investigation outcomes. |
| Design | Construction of research design and linkage of research questions, data, and criteria for evaluation and synthesis. |
| Prepare | Draft, execution, and approval of study protocols. |
| Collect | Data collection strategy executed from a perspective to capture the decision making of the course designers centered on cost attributes. |
| Analyze | Data extracted into categories for review and analyzed for variance calculation. Data analysis centers on 3 cost categories in the design of the preproduction budget submitted to the funder for each case. Category A (concept and measurement of costs): the preproduction budget was analyzed for the following ingredient categories: (1) personnel, (2) estate charges, (3) equipment and materials, (4) indirect costs, and (5) stakeholder costs. Category B (placing values on ingredients): with the full cost of production defined, values were associated with each ingredient subcategory to reflect the chargeable cost. Category C (calculating costs): to record a variance calculation, the budget was compared with the incurred costs on a quarterly basis. Variance = Actual spending − Budgeted spending. |
| Share | The findings of the variance calculation and synthesis of analysis of reasons leading to variation were presented in a report for publication in a peer-reviewed journal (this study). |

The research design ( Table 2 ) was structured on 4 components (proposition, the case definition, logic linking data to the proposition, and criteria for interpreting findings) to explore the following research question: how are the total costs for the production and delivery of e-Learning calculated (with the e-Learning implementation type variant depending on the case study)? Given the inconsistency in the presentation of costs in the literature and recognizing that using budgets to determine the cost of educational delivery is insufficient [ 21 ], the governing proposition of the investigation was that there would be variance between the budgeted costs and the actual costs to produce the course. This was explored through cases that would examine the cost and the measurement of costs and place value on ingredients. Levin developed this ingredients method to capture and analyze the costs in the delivery of an educational program. To link the case to the proposition, the cost calculation was completed and then interpreted via a variance calculation of actual to budgeted costs, and rationales were developed to justify variations.

Table 2. Case study research design.

| Case (year) | Study question | Proposition | The case (definition) | Logic linking data to the proposition | Criteria for interpreting findings |
| --- | --- | --- | --- | --- | --- |
| Case 1: Educating administrative staff to engage with young patients (2016) [ ] | How are the total costs for the production and delivery of this e-Learning course calculated? | Actual and budgeted costs will vary in the production/delivery of this course type | Determination and measurement of costs | Cost analysis of project, actual, and underreported costs | Variance calculation from the project budget |
| Case 2: The impact of climate change on public health (2017) [ ] | How are the total costs for the production and delivery of this e-Learning course calculated? | Actual and budgeted costs will vary in the production/delivery of this course type | Determination and measurement of costs | Cost analysis of project, actual, and underreported costs | Variance calculation from the project budget |
| Case 3: Data science in healthcare using real world evidence (2018) [ ] | How are the total costs for the production and delivery of this e-Learning course calculated? | Actual and budgeted costs will vary in the production/delivery of this course type | Determination and measurement of costs | Cost analysis of project, actual, and underreported costs | Variance calculation from the project budget |

Examination of these cases provides data to analyze the relationship between course production and budgeting in the delivery of e-Learning and provides evidence for constructing accurate budget models.

Each case was tested for construct validity (testing that data sources come from multiple sources), external validity (testing that demonstrates how principal findings could be extensible) and reliability (testing that shows how the activities of the study can be replicated) to ensure data triangulation, the ability for study replication, and standardization for project data collection [ 13 ]. Ethical approval for each study was obtained through the Imperial College Education Ethics Research Committee (case 1: EERP1516-005; case 2 and 3: EERP1617-030).

The investigation was focused on cost measurement and analysis, structured by 3 cost categories, and further subdivided using a 7-step process (illustrated in Table 3 below) to analyze the pre- and postproduction budget [ 21 ]. Levin’s model uses an activity-based standard-costing accountancy approach, which assigns costs as they are consumed per implementation area [ 25 , 26 ].

Table 3. Course production ingredients cost analysis.

| Cost categories | Objectives—adapted from Levin (2001, 2018) [ , ] |
| --- | --- |
| Category A: concept and measurement of costs | Steps 1 to 5: describe the concept of costs; show the inadequacy of budgets for cost analysis; present a methodology for measuring costs; identify categories of cost ingredients; describe sources of cost information |
| Category B: placing values on ingredients | Steps 6 and 7: describe the purpose and principles for determining the values of ingredients; present methods for placing values on specific types of ingredients |

Evidence from the course was retrieved from project documents and records of finance activity. The data collection strategy was executed from a realist perspective to capture the decisions made by the course designers; however, it did not incorporate a relativist perspective with regard to stakeholders through further qualitative investigation. This decision was made to avoid interference in course delivery. To control for biased selectivity and reporting bias, the data were sourced from multiple places, including finance logs (and notes), data submitted to the employer and the funder, and timesheets. A traceability log was maintained linking the study questions to the relevant data sources and the study findings.

Data analysis centered on the 3 cost categories and followed the 7-step process for cost definition.

Category A: Concept and Measurement of Costs

The preproduction budget was analyzed for the following ingredient categories: (1) personnel, (2) estate charges, (3) equipment and materials, (4) indirect costs and (5) stakeholder costs. The initial budgets did not reflect time for stakeholder costs (effort from third-party lecturers); therefore, this was captured as the additional time that was monitored in the study (and added for budget variance calculation), as there was no value for this in the data submitted to the funder.

Category B: Placing Values on Ingredients

With the full cost of production defined, values were associated with each ingredient subcategory to reflect the chargeable cost (including direct and indirect costs).

Category C: Calculating Costs

As each course was implemented in 1 year, and the courses were Web-based, there were no multiyear costs to calculate; the one-time cost of the project and the variance of the projected budget to the actual budget were the only variables under consideration. To accomplish this, the variance calculation of the budget to the incurred costs was undertaken at the completion of the project. The variance calculation compares actual costs to adjusted standard conditions based on occurrence [ 28 ].

The variance calculation formula is as follows: Variance = Actual spending − Budgeted spending.
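A brief sketch of how this per-ingredient variance calculation might look in code, using the study's five ingredient categories with placeholder figures; note the sign-convention caveat in the comments.

```python
# Placeholder figures, not the cases' actual budgets. Note the sign
# convention: the formula above (actual - budgeted) makes an overrun
# positive, while the study's results report overruns as *negative*
# percentages, i.e., with the sign flipped.
budgeted = {"personnel": 40_000, "estate charges": 5_000,
            "equipment and materials": 12_000, "indirect costs": 8_000,
            "stakeholder costs": 6_000}
actual = {"personnel": 52_000, "estate charges": 5_000,
          "equipment and materials": 28_000, "indirect costs": 8_500,
          "stakeholder costs": 10_500}

for ingredient, plan in budgeted.items():
    variance = actual[ingredient] - plan
    print(f"{ingredient}: {variance:+,} ({variance / plan:+.0%} of budget)")

total = sum(actual.values()) - sum(budgeted.values())
print(f"total: {total:+,}")
```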

Analyzing Costs of Observed Budget Variance Calculations

To determine the reasons for favorable or negative budget variance, the course designers were interviewed about the factors contributing to budget variance. This qualitative work was planned via the consolidated criteria for reporting qualitative research [ 29 ] to ensure that appropriately trained staff conducted interviews, that the study design included purposeful sampling of the course designers, that sessions could be validated in the interviews, and that the resultant analysis and findings would be repeatable [ 29 ]. The sessions were conducted as semistructured interviews, transcribed, and coded via thematic analysis [ 30 ] with total quality management (TQM) as the coding criteria. TQM [ 31 ] is a quality appraisal method used to analyze factors affecting operational efficiency [ 32 ]. TQM provides a means to categorize issues relating to people, process, or technology by applying a systems approach to management (see Figure 1). For each area of cost variance, the course designers were asked to review budget reports to identify stages in the project lifecycle with variances from forecast and to describe the contributing factors. After the interviews, these were coded independently by 2 researchers to create a novel means of interpreting the cost calculation variance. For example, if a cost variance was attributed to stakeholder costs, the researchers would examine the reported quarterly budgets (or budgets at the project's time interval) and determine where the variance began occurring. If the variance commenced during the build stage of the project, the project plan was analyzed, and questions about the project's activities were put to the course designers to determine the root cause.

Figure 1. Isolating variance during project stage to total quality management criteria.

The key themes from the TQM analysis are presented for each case, summarizing areas for improvement or efficiency in e-Learning budget creation.

The findings of the variance calculation and the deductive-inductive interpretation of the reasons for variation were presented in a case report to the course design and production team. Feedback was gathered on the analysis and results, and the key findings of each report were prepared for publication in a peer-reviewed journal.

Cross-Case Synthesis

To derive results from the composite analysis of the cases, this study uses cross-case synthesis [ 13 ], as illustrated in Figure 2 . The standard variables across the cases center on the ingredients and their incurred cost variance from budget.

Figure 2. Cross-case synthesis. MOOC: massive open online course; SPOC: small private online course.

Course Production Costs

Costs for each case were summarized into components and separated into ingredient cost categories ( Table 4 ).

Table 4. Ingredient categories and their cost components.

  • Personnel: University staff
  • Estate charges: Information technology services charges
  • Equipment and materials: Course production equipment and application development costs for the creation of software to support the massive open online course
  • Indirect costs: University overheads
  • Stakeholder costs: Staff for third-party subject matter consultancy

Upon completion of the analysis of the ingredients of the course production, initial budgets were created and submitted to the funder.

Category C: Analyzing Costs

Budget variance calculation.

The project implementation costs in this case had a negative variance of 41% ( Multimedia Appendix 4 ). The most significant negative variance (135%; Multimedia Appendix 4 ) was in equipment and materials, primarily from the costs of app development in the creation of a Web-based course. As the production team had not created a Web-based course before, the time required to build and configure the system (which was developed on the Open edX learning management system platform) and to complete course editing was significantly underestimated. In addition, specialist recording equipment had to be procured, a need that was not understood when the budget was completed. The next most substantial negative variance (76%; Multimedia Appendix 4 ) was in the amount of time required from third-party stakeholders for the production of learning materials: the time allocated for recording the lecturers was underestimated, and several re-recordings were needed to address content changes. The lowest negative cost variance (31%; Multimedia Appendix 4 ) was in the personnel costs to deliver the course. Although this variance was the smallest of the 3 categories, it was significant because the course production team did not receive any additional compensation for their extra work; this work was captured in the project timesheets but not submitted to the funder for reimbursement.

The actual costs varied from the budgeted costs in personnel, equipment and materials, and stakeholder costs, and the total cost of production had a negative variance of 113% ( Multimedia Appendix 4 ) from the budgeted amount. The most significant variance was in stakeholder costs, where the total time for external lecturers and subject matter experts to deliver work was significantly underbudgeted, with a negative variance of 190% ( Multimedia Appendix 4 ). The reason for this underestimate was that videos had to be reshot twice, and the time needed to bring stakeholders back and complete the associated course updates dramatically affected the budget. The second largest variance was in personnel; this cost variance was directly related to the additional production time required for the video reshoots, in addition to the iterative development of the platform. The online learning provider for the course also switched from the edX to the FutureLearn learning management system during the project, requiring rework of previously completed tasks. As the team was not experienced with the FutureLearn platform, this accounted for further effort and contributed to the unfavorable budget variance; a team with experience and training in design for the course material would most likely have attained different results. Finally, equipment and materials were also underestimated, with a negative variance of 133% ( Multimedia Appendix 4 ), owing to additional software required for video editing and additional workstations acquired to handle the extra editing required in course development.

In contrast to the previous cases, this case demonstrated a positive variance of 16% ( Multimedia Appendix 4 ) from the initial budget. Stakeholder costs for subject matter expert lecturers were slightly overestimated but close to budget. It is important to note that the third-party stakeholder team had significant previous experience of working together to produce related coursework, which could explain the precision of the effort estimation. Equipment and materials had a significant positive variance of 37% ( Multimedia Appendix 4 ): not all the equipment planned for course development proved necessary, because efficiencies in course production and the streamlining of data science modules removed the need for the custom app development that had been anticipated. Personnel had a negative variance of 13%, related to additional effort required in video editing. In addition, the course was completed ahead of schedule, in less time than anticipated.

The construction of the cost ingredients and subsequent cost analysis underwent 3 validation tests ( Table 5 ).

Table 5. Cross-case results validation tests.

Case 1
  • Construct validity: To achieve data triangulation, the case study had multiple sources of cost data: (1) the project budget submitted to the project funder, (2) the actual costs submitted to the funder at the completion of the project, and (3) the timesheet log of hours captured by the course implementers. The final case report was reviewed, and feedback was gathered from the course designers (BS, MT); any inconsistencies or inaccuracies were corrected.
  • External validity: By using Levin's ingredients method for cost identification, the case followed an established costing procedure that serves as the basis for analytic frameworks for economic evaluation in education. This process, based on a common analytic framework, allows the study findings to be generalized to similar use cases.
  • Reliability: A study protocol was created at the commencement of the case; the protocol details the structure of the study and how data were collected to ensure the reliability of the results.

Case 2
  • Construct validity: Multiple sources of cost data and reporting data were used to validate that the data sources were an accurate record of what occurred: (1) the project budget created at project commencement, (2) the actual cost report submitted at the completion of the project, (3) the timesheet log of hours captured by each team resource, (4) a third-party work-log for course production and a monitor of billable hours charged to the program, (5) external audit reports on the course construction, and (6) notes from monthly reviews of budget spend. The final case report was reviewed, and feedback was gathered from the course designers (BS, MT); feedback was provided and reviewed by the research team to ensure implementation accuracy.
  • External validity: The repetition of a model used in prior research [ ], the application of Levin's ingredients method for education intervention analysis, and the use of standard costing and variance calculation activity-based costing methods demonstrated a common analytic framework that is transportable to other studies.
  • Reliability: A study protocol was used and formed the governing basis for the study.

Case 3
  • Construct validity: The data sources for each ingredient category were (1) the initial project budget, (2) reported submitted costs, (3) a time log of hours worked, and (4) a third-party work-log of the activities of subcontracted courses. The final case report was reviewed to ensure accuracy.
  • External validity: The same process used in the 2 previous cases was replicated [ ], and the application of Levin's ingredients method for education intervention analysis demonstrated a common analytic framework transportable to other electronic learning studies.
  • Reliability: A minor variation of the previous study protocols was used and stored as the governance framework for the study.

Issues affecting budget variance were classified using TQM to categorize the factors influencing the budget ( Table 6 ). Although each course was implemented with a different form of e-Learning, the issues affecting each case were similar and cross-applicable. The critical consideration in budgeting is less the type of e-Learning than the planning associated with the project management of course creation.

Table 6. Total quality management category of issues affecting budget adherence (each issue was classified as relating to people, process, and/or technology).

Case 1
  • The inadequacy of project budgets at the commencement of Web-based learning for new teams
  • Underreporting of personnel costs

Case 2
  • Resource task estimation and management
  • Contingency planning
  • Third-party resource management
  • Need for an update of course materials

Case 3
  • Cost efficiencies in the delivery of a course piloted in previous years
  • Experience and relationship of the course learning team
  • Agile project management methods and iterative budget management

Project Management

Each case implemented project management methods to organize crucial deliverables and tasks, and each integrated a learning design methodology in a different way. Case 1 employed project-related, task-centered actions constructed to match each learning outcome. Case 2 integrated the analysis, design, development, implementation, and evaluation (ADDIE) model, with course planning structured along each of these design stages, whereas case 3 implemented an agile project management model (with iterations) while using the ADDIE model in course construction.

Participant Information

A total of 124 learners enrolled in the SPOC from September 2016 to December 2016 ( Table 7 ). Of these, 84% completed the course and received a postcourse certificate. Course uptake and completion, however, did not influence production costs after course implementation, as the course was designed as a self-managed SPOC requiring no further administration after deployment.

Table 7. Electronic learning implementation participation summary.

  • Case 1: Educating administrative staff to engage with young patients (2016) — 124 learners, 84% completion
  • Case 2: The impact of climate change on public health (2017) [ ] — 968 learners, 17% completion
  • Case 3: Data science in health care using real world evidence (2018) — 5036 learners, 12% completion

A total of 968 learners participated in the MOOC from November 2017 to December 2017 ( Table 7 ). Of these, 17% completed the course. The course completion ratio was in line with completion rates for MOOCs [ 33 ], where although there is a high uptake of initial learners, completion of course activity ranges from 8% to 20%.

A total of 5036 learners participated in the MOOC from September 2018 to December 2018 ( Table 7 ). Of these, 12% completed the course. The course completion ratio was also in line with completion rates for MOOCs [ 33 ]. A blended residential course was held in November 2018 with the participation of 14 learners (these learners are included in the MOOC count). In this residential course, the participants completed the MOOC as prelearning and then undertook case studies, putting the course learning into practice.

Principal Findings

This study aimed to establish an approach for identifying the costs of the design, development, and deployment of applied health professions e-Learning courses. The standard components for the construction of an e-Learning course were determined by the methods used in this study, which combined existing approaches to cost budgeting with qualitative methods for the interpretation of results. Whereas Levin's ingredients method provides a mechanism for categorizing design and implementation costs for budgeting, TQM provides a qualitative framework for examining the effect of design and production decisions on the budget. The key factors affecting the ability of the budget to deliver in line with expectations at the close of the project were related to process. Familiarization with technology was also a key issue in cases 1 and 2, where familiarity with production methods and learning technology affected the anticipated effort.

The key recommendations from the examination of these cases center on 3 areas of process-related enhancement: 1 concerning project management and the remaining 2 concerning budget management, both related to course production and instructional design:

Project Management: Linkage of Instructional Design Method to Stages in the Project Lifecycle With Time Tracking

Project management enables the planning and prioritizing of activities; the management of risks, issues, and actions; and the assurance of quality. In the observed cases, the use of robust project management methods and of iterative methods to validate learning materials tended to produce favorable results. In addition, linking the instructional design approach to project stages, and tracking the time spent on each component, creates awareness and ties the financial effect of delivery to course building.

Budget Planning: Use of Confidence Factors in Budget Time Estimating

A vital issue in all cases was misestimating the amount of effort required for build tasks. To better manage time tracking, we have suggested tracking tasks by time linked to the learning design; as an additional measure, building confidence factors into budgets allows a degree of error and contingency in initial budgets. A confidence factor is a percentage added to an initial cost forecast as a contingency; applying confidence factors based on the requirements, the familiarity of the approach, and other factors can lead to higher estimation precision.
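As a minimal sketch of how such factors might be applied (the risk classes, percentages, and line items below are hypothetical, not drawn from the study):

```python
# Hypothetical contingency percentages: more uncertainty -> larger factor.
CONFIDENCE_FACTORS = {
    "familiar task": 0.10,       # well-understood work, small buffer
    "new platform": 0.35,        # little prior experience with the tooling
    "third-party effort": 0.50,  # effort largely outside the team's control
}

def with_contingency(base_estimate: float, risk_class: str) -> float:
    """Inflate a base cost estimate by the contingency for its risk class."""
    return base_estimate * (1 + CONFIDENCE_FACTORS[risk_class])

budget_lines = [
    ("personnel", 40_000, "familiar task"),
    ("equipment and materials", 10_000, "new platform"),
    ("stakeholder costs", 8_000, "third-party effort"),
]

for name, estimate, risk in budget_lines:
    print(f"{name}: {estimate:,} -> {with_contingency(estimate, risk):,.0f}")
```

The design intent is simply that less familiar work carries a larger buffer; calibrating the percentages against records from past projects, as recommended below, keeps the factors evidence-driven.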

Budget Planning: Modeling Budget Forecasting on Similar Implementations

Case 3 was the most successful in delivery because the course team had worked together delivering similar content, gained efficiency from preexisting relationships, and had an evidence base from which to build its cost models. When planning e-Learning implementations, the starting point should likewise be previous projects or data from the literature on the factors influencing costs, so that budgets are not built from scratch. Part of the budget variance observed in cases 1 and 2 stemmed from cost estimates that were not built on prior evidence; this can be controlled by using an experience-driven starting point.

Strengths and Limitations

This study analyzed 3 distinct cases of e-Learning covering 6128 applied health learners over 3 years and provided a comprehensive summary of the issues affecting the production and development of a course. This information could be useful to course designers in planning their e-Learning implementations and in drawing on lessons learned to plan budgets that ensure projects meet their objectives.

We noted 4 limitations of this study. First, case study research can only provide a snapshot of activities as observed in each case, and these cases may have limited applicability to other contexts. This was mitigated using construct validity, external validity, and reliability tests in each case, but case study research has an inherent limitation in the observation of events by design; experimental methods would deliver more rigorous tests of the results. In addition, the selection of the case studies was opportunistic, as they were e-Learning projects accessible within the first author's research unit. The second limitation is that further qualitative investigation of the attitudes, views, and perceptions of stakeholders was not undertaken. This would have added a further dataset for analyzing the factors affecting budgeting; the researchers therefore drew conclusions from data that might have been viewed differently after direct inquiry with stakeholders. It is important to note, however, that stakeholders did review the final case reports for accuracy and consistency with events. The third limitation is that the study did not critically examine the decisions made by the course designers regarding authoring tools, license costs, expertise, and other factors affecting direct costs; examining these costs, including triangulation among the 3 sources, would yield further evidence bearing on the results. Finally, the study used a mixed-methods approach for horizontal budget analysis but did not analyze offsetting or magnifying variances, return on investment, forecasting, sensitivity analysis, or other financial planning and analysis methods. An economic study focused on outcomes and cost could provide further data that would potentially influence implementation considerations.

Further Research

The outputs of this study, in addition to the process of execution and reflection on both strengths and limitations, suggest 3 possible areas for future research:

Standards for Costing Economic Evaluations of e-Learning Implementations

Few economic evaluations of e-Learning are conducted, most likely because educators focus on content delivery and educational effect rather than on creating cost evidence. This study has created an extension of existing costing methods and demonstrated how it can be applied to e-Learning, allowing future researchers to reuse this approach to create consistent costing data that could subsequently be benchmarked. A growing evidence base of e-Learning cost data could also promote further research into various forms of economic evaluation, creating possible business cases for future investment in e-Learning, should value be demonstrated.

Integration of Project Management, Instructional Design Methods, and Costing

This study observed benefits from combining project management methods with instructional design methods; further research investigating ways of integrating existing instructional design methods with project management methodologies, and of linking these methods with cost management approaches, could help address the high investment cost of e-Learning.

Cost and Value Perceptions of Students and Educators

Using improved cost data from the approaches in this research, further research could attempt to identify perceptions of cost and value by comparing the perspectives of students and educators.

e-Learning research consistently refers to the promise of its cost-effectiveness in contrast to face-to-face instruction; however, the underlying data on the costs necessary for delivery are not well understood [ 8 ]. To undertake further economic evaluation of the properties demonstrating the value of e-Learning in contrast to other learning types, it is first necessary to develop a standard means of calculating the costs of delivering these types of projects. Through consistent management of the factors affecting costs in course production, further research could apply standard economic evaluation methods to evaluate the advantages of e-Learning. This study builds an understanding of the issues affecting cost planning for the design, development, and deployment of e-Learning courses and provides recommendations on controlling cost variance within e-Learning projects. It contributes a systematic approach to costing in e-Learning that course designers and researchers could use to design and calculate costs in production and deployment.

Acknowledgments

This project was supported by the European Institute of Innovation and Technology—EIT Health Knowledge and Innovation Community and the Higher Education Funding Council for England—Catalyst Fund. Boris Serafimov and Mel Toumazos provided detailed data contributing to the study in the design, development, and deployment of the e-Learning courses examined in this study. Yusuf Ermak and Hassan Chaudhury provided valuable contributions to the Data Science Courses as lecturers.

Abbreviations

ADDIE: analysis, design, development, implementation, and evaluation
MOOC: massive open online course
SPOC: small private online course
TQM: total quality management
WHO: World Health Organization

Multimedia Appendices 1, 2, 3, and 4.

Authors' Contributions: EM conceived the study topic, wrote the first draft, responded to peer-review feedback, and is the principal investigator on the research project. DB, KF, and AA reviewed the completed draft manuscripts and provided feedback on iterations. JC supervised the investigation. EM is the guarantor.

Conflicts of Interest: None declared.


Quantitative Data Analysis: Methods, Applications, and Case Studies

August 29th, 2024

The ability to analyze and understand numbers properly has become increasingly valuable.

Analyzing numerical data systematically involves thoughtfully collecting, organizing, and studying data to discover patterns, trends, and connections that can guide important choices.  

Key Highlights

  • Analyzing data numerically involves gathering information, organizing it neatly, and examining the numbers to gain insights and make data-informed choices.
  • It draws on methods such as descriptive statistics, predictive modeling, machine learning, and other statistical techniques to make sense of the data.
  • For businesses, researchers, and organizations, analyzing the numbers is key to spotting patterns, relationships, and changes over time within their data.
  • Such analyses enable data-driven decision-making, outcome projection, intelligent risk assessment, and the refinement of strategies and workflows. Finding meaning in the metrics helps optimize processes.

What is Quantitative Data Analysis?

Quantitative data analysis is how you learn from numbers. It applies statistical methods and computational processes to study and make sense of data so you can spot patterns, connections, and changes over time, giving insight to guide decisions.

At the core, quantitative analysis builds on math and stats fundamentals to turn raw figures into meaningful knowledge.

The process usually starts with gathering related numbers and organizing them neatly. Then analysts use different statistical techniques like descriptive stats, predictive modeling, and more to pull out valuable lessons.

Descriptive stats provide a summary of the key details, like averages and how spread out the numbers are. This helps analysts understand the basics and find any weird outliers.

Inferential stats allow analysts to predict broader trends based on a sample. Things like hypothesis testing , regression analysis, and correlation investigations help identify significant relationships.

Machine learning and predictive modeling have also enhanced working with numbers. These sophisticated methods let analysts create models that can forecast outcomes, recognize patterns across huge datasets, and uncover hidden insights beyond basic stats alone.

Leveraging data-based evidence supports more informed management of resources.

Data Collection and Preparation

The first step in any quantitative data analysis is collecting the relevant data. This involves determining what data is needed to answer the research question or business objective.

Data can come from a variety of sources such as surveys, experiments, observational studies, transactions, sensors, and more. 

Once the data is obtained, it typically needs to go through a data preprocessing or data cleaning phase.

Real-world data is often messy, containing missing values, errors, inconsistencies, and outliers that can negatively impact the analysis if not handled properly. Common data cleaning tasks include the following (a brief pandas sketch follows the list):

  • Handling missing data through imputation or case deletion
  • Identifying and treating outliers 
  • Transforming variables (e.g. log transformations)
  • Encoding categorical variables
  • Removing duplicate observations
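As a brief illustration of these tasks using pandas (the dataset and column names are made up):

```python
import numpy as np
import pandas as pd

# Hypothetical raw dataset exhibiting common quality problems
df = pd.DataFrame({
    "age":    [34, np.nan, 29, 29, 120],        # a missing value and an outlier
    "income": [52000.0, 48000.0, None, None, 61000.0],
    "region": ["north", "south", "south", "south", "north"],
})

df = df.drop_duplicates()                             # remove duplicate rows
df["age"] = df["age"].fillna(df["age"].median())      # impute missing ages
df = df[df["age"].between(0, 100)]                    # drop an implausible outlier
df["income"] = df["income"].fillna(df["income"].mean())
df["log_income"] = np.log(df["income"])               # variable transformation
df = pd.get_dummies(df, columns=["region"])           # encode a categorical

print(df)
```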

The goal of data cleaning is to ensure that quantitative data analysis techniques can be applied accurately to high-quality data. Proper data collection and preparation lays the foundation for reliable results.

In addition to cleaning, the data may need to be structured or formatted in a way that statistical software and data analysis tools can read it properly.

For large datasets, data management principles like establishing data pipelines become important.

Descriptive Statistics of Quantitative Data Analysis

Descriptive statistics is a crucial aspect of quantitative data analysis that involves summarizing and describing the main characteristics of a dataset.

This branch of statistics aims to provide a clear and concise representation of the data, making it easier to understand and interpret.

Descriptive statistics are typically the first step in analyzing data, as they provide a foundation for further statistical analyses and help identify patterns, trends, and potential outliers.

The most common descriptive statistics measures include the following (a short pandas sketch follows the list):

  • Mean : The arithmetic average of the data points.
  • Median : The middle value in a sorted dataset.
  • Mode : The value that occurs most frequently in the dataset.
  • Range : The difference between the highest and lowest values in the dataset.
  • Variance : The average of the squared deviations from the mean.
  • Standard Deviation : The square root of the variance, providing a measure of the spread of data around the mean.
  • Histograms : Visual representations of the distribution of data using bars.
  • Box Plots : Graphical displays that depict the distribution’s median, quartiles, and outliers.
  • Scatter Plots : Displays the relationship between two quantitative variables.
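Most of these measures are one-line computations in pandas; the sketch below, using made-up numbers, is purely illustrative:

```python
import pandas as pd

data = pd.Series([12, 15, 15, 18, 21, 24, 95])  # made-up sample with an outlier

print("mean:", data.mean())
print("median:", data.median())
print("mode:", data.mode().tolist())
print("range:", data.max() - data.min())
print("variance:", data.var())   # sample variance (ddof=1) by default
print("std dev:", data.std())
print(data.describe())           # count, mean, std, min, quartiles, max
```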

Descriptive statistics play a vital role in data exploration and understanding the initial characteristics of a dataset.

They provide a summary of the data, allowing researchers and analysts to identify patterns, detect potential outliers, and make informed decisions about further analyses.

However, it’s important to note that descriptive statistics alone do not provide insights into the underlying relationships or causal mechanisms within the data.

To draw meaningful conclusions and make inferences about the population, inferential statistics and advanced analytical techniques are required.

Inferential Statistics

While descriptive statistics provide a summary of data, inferential statistics allow you to make inferences and draw conclusions from that data.

Inferential statistics involve taking findings from a sample and generalizing them to a larger population. This is crucial when it is impractical or impossible to study an entire population.

The core of inferential statistics revolves around hypothesis testing . A hypothesis is a statement about a population parameter that needs to be evaluated based on sample data.

The process involves formulating a null and alternative hypothesis, calculating an appropriate test statistic, determining the p-value, and making a decision whether to reject or fail to reject the null hypothesis.

Some common inferential techniques include:

T-tests – Used to determine if the mean of a population differs significantly from a hypothesized value or if the means of two populations differ significantly.

ANOVA ( Analysis of Variance ) – Used to determine if the means of three or more groups are different.  

Regression analysis – Used to model the relationship between a dependent variable and one or more independent variables. This allows you to understand drivers and make predictions.

Correlation analysis – Used to measure the strength and direction of the relationship between two variables.

Inferential statistics are critical for quantitative research, allowing you to test hypotheses, establish causality, and make data-driven decisions with confidence in the findings.

However, the validity depends on meeting the assumptions of the statistical tests and having a properly designed study with adequate sample sizes.

The interpretation of inferential statistics requires care. P-values indicate the probability of obtaining the observed data assuming the null hypothesis is true – they do not confirm or deny the hypothesis directly. Effect sizes are also crucial for assessing the practical significance beyond just statistical significance.
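To make the hypothesis-testing workflow concrete, here is a minimal two-sample t-test sketch using SciPy; the data are simulated, and the 5% significance level is just the conventional choice:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
group_a = rng.normal(loc=100, scale=15, size=50)  # e.g., control group
group_b = rng.normal(loc=108, scale=15, size=50)  # e.g., treatment group

# H0: the population means are equal; H1: they differ.
t_stat, p_value = stats.ttest_ind(group_a, group_b)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
print("Reject H0" if p_value < 0.05 else "Fail to reject H0")

# Effect size (Cohen's d) to gauge practical significance
pooled_sd = np.sqrt((group_a.var(ddof=1) + group_b.var(ddof=1)) / 2)
print(f"Cohen's d = {(group_b.mean() - group_a.mean()) / pooled_sd:.2f}")
```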

Predictive Modeling and Machine Learning

Quantitative data analysis goes beyond just describing and making inferences about data – it can also be used to build predictive models that forecast future events or behaviors.

Predictive modeling uses statistical techniques to analyze current and historical data to predict unknown future values. 

Some of the key techniques used in predictive modeling include regression analysis , decision trees , neural networks, and other machine learning algorithms.

Regression analysis is used to understand the relationship between a dependent variable and one or more independent variables.

It allows you to model that relationship and make predictions. More advanced techniques like decision trees and neural networks can capture highly complex, non-linear relationships in data.

Machine learning has become an integral part of quantitative data analysis and predictive modeling. Machine learning algorithms can automatically learn and improve from experience without being explicitly programmed.

They can identify hidden insights and patterns in large, complex datasets that would be extremely difficult or impossible for humans to find manually.

Some popular machine learning techniques used for predictive modeling include:

  • Supervised learning (decision trees, random forests, support vector machines)
  • Unsupervised learning ( k-means clustering , hierarchical clustering) 
  • Neural networks and deep learning
  • Ensemble methods (boosting, bagging)

Predictive models have a wide range of applications across industries, from forecasting product demand and sales to identifying risk of customer churn to detecting fraud.

With the rise of big data , machine learning is becoming increasingly important for building accurate predictive models from large, varied data sources.
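A minimal supervised-learning sketch using scikit-learn, with synthetic data standing in for a real churn or demand dataset:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for, e.g., a customer-churn dataset
X, y = make_classification(n_samples=1000, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

print("holdout accuracy:", accuracy_score(y_test, model.predict(X_test)))
```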

Quantitative Data Analysis Tools and Software

To effectively perform quantitative data analysis, having the right tools and software is essential. There are numerous options available, ranging from open-source solutions to commercial platforms.

The choice depends on factors such as the size and complexity of the data, the specific analysis techniques required, and the budget.

Statistical Software Packages

  • R : A powerful open-source programming language and software environment for statistical computing and graphics. It offers a vast collection of packages for various data analysis tasks.
  • Python : Another popular open-source programming language with excellent data analysis capabilities through libraries like NumPy, Pandas, Matplotlib, and scikit-learn.
  • SPSS : A commercial software package widely used in academic and research settings for statistical analysis, data management, and data documentation.
  • SAS : A comprehensive software suite for advanced analytics, business intelligence, data management, and predictive analytics.
  • STATA : A general-purpose statistical software package commonly used in research, especially in the fields of economics, sociology, and political science.

Spreadsheet Applications

  • Microsoft Excel : A widely used spreadsheet application that offers built-in statistical functions and data visualization tools, making it suitable for basic data analysis tasks.
  • Google Sheets : A free, web-based alternative to Excel, offering similar functionality and collaboration features.

Data Visualization Tools

  • Tableau : A powerful data visualization tool that allows users to create interactive dashboards and reports, enabling effective communication of quantitative data.
  • Power BI : Microsoft’s business intelligence platform that combines data visualization capabilities with data preparation and data modeling features.
  • Plotly : A high-level, declarative charting library that can be used with Python, R, and other programming languages to create interactive, publication-quality graphs.

Business Intelligence (BI) and Analytics Platforms

  • Microsoft Power BI : A cloud-based business analytics service that provides data visualization, data preparation, and data discovery capabilities.
  • Tableau Server/Online : A platform that enables sharing and collaboration around data visualizations and dashboards created with Tableau Desktop.
  • Qlik Sense : A data analytics platform that combines data integration, data visualization, and guided analytics capabilities.

Cloud-based Data Analysis Platforms

  • Amazon Web Services (AWS) Analytics Services : A suite of cloud-based services for data analysis, including Amazon Athena, Amazon EMR, and Amazon Redshift.
  • Google Cloud Platform (GCP) Data Analytics : GCP offers various data analytics tools and services, such as BigQuery, Dataflow, and Dataprep.
  • Microsoft Azure Analytics Services : Azure provides a range of analytics services, including Azure Synapse Analytics, Azure Data Explorer, and Azure Machine Learning.

Applications of Quantitative Data Analysis

Quantitative data analysis techniques find widespread applications across numerous domains and industries. Here are some notable examples:

Business Analytics

Businesses rely heavily on quantitative methods to gain insights from customer data, sales figures, market trends, and operational metrics.

Techniques like regression analysis help model customer behavior, while clustering algorithms enable customer segmentation. Forecasting models allow businesses to predict future demand, inventory needs, and revenue projections.

Healthcare and Biomedical Research with Quantitative Data Analysis

Analysis of clinical trial data, disease prevalence statistics, and patient outcomes employs quantitative methods extensively.

Hypothesis testing determines the efficacy of new drugs or treatments. Survival analysis models patient longevity. Data mining techniques identify risk factors and detect anomalies in healthcare data.

Marketing and Consumer Research

Marketing teams use quantitative data from surveys, A/B tests, and online behavior tracking to optimize campaigns. Regression models predict customer churn or likelihood to purchase.

Sentiment analysis derives insights from social media data and product reviews. Conjoint analysis determines which product features impact consumer preferences.

Finance and Risk Management with Quantitative Data Analysis

Quantitative finance relies on statistical models for portfolio optimization, derivative pricing, risk quantification, and trading strategy formulation. Value at Risk (VaR) models assess potential losses. Monte Carlo simulations evaluate the risk of complex financial instruments.
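As an illustration of the Monte Carlo approach to VaR, the sketch below uses arbitrary return assumptions purely for demonstration:

```python
import numpy as np

rng = np.random.default_rng(7)

portfolio_value = 1_000_000   # hypothetical portfolio, in dollars
mu, sigma = 0.0005, 0.02      # assumed daily return mean and volatility

# Simulate 100,000 one-day portfolio returns and convert to losses
simulated_returns = rng.normal(mu, sigma, size=100_000)
losses = -portfolio_value * simulated_returns

# 95% one-day VaR: the loss exceeded on only 5% of simulated days
var_95 = np.percentile(losses, 95)
print(f"95% one-day VaR: ${var_95:,.0f}")
```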

Social and Opinion Research

From political polls to consumer surveys, quantitative data analysis techniques like weighting, sampling, and survey data adjustment are critical. Researchers employ methods like factor analysis, cluster analysis, and structural equation modeling .

Case Studies

Case Study 1: Netflix's Data-Driven Recommendations

Netflix extensively uses quantitative data analysis, particularly machine learning, to drive its recommendation engine.

By mining user behavior data and combining it with metadata about movies and shows, they build predictive models to accurately forecast what a user would enjoy watching next.

Case Study 2: Moneyball – Analytics in Sports

The adoption of sabermetrics and analytics by baseball teams like the Oakland Athletics, as depicted in the movie Moneyball, revolutionized player scouting and strategy.

By quantifying player performance through new statistical metrics, teams could identify undervalued talent and gain a competitive edge.

Quantitative data analysis is a powerful toolset that allows organizations to derive valuable insights from their data to make informed decisions.

By applying the various techniques and methods discussed, such as descriptive statistics, inferential statistics , predictive modeling , and machine learning, businesses can gain a competitive edge by uncovering patterns, trends, and relationships hidden within their data.

However, it’s important to note that quantitative data analysis is not a one-time exercise. As businesses continue to generate and collect more data, the analysis process should be an ongoing, iterative cycle.

If you're looking to further enhance your quantitative data analysis capabilities, there are several potential next steps to consider:

  • Continuous learning and skill development : The field of data analysis is constantly evolving, with new statistical methods, modeling techniques, and software tools emerging regularly. Investing in ongoing training and education can help analysts stay up-to-date with the latest advancements and best practices.
  • Investing in specialized tools and infrastructure : As data volumes continue to grow, organizations may need to invest in more powerful data analysis tools, such as big data platforms, cloud-based solutions, or specialized software packages tailored to their specific industry or use case.
  • Collaboration and knowledge sharing : Fostering a culture of collaboration and knowledge sharing within the organization can help analysts learn from each other’s experiences, share best practices, and collectively improve the organization’s analytical capabilities.
  • Integrating qualitative data : While this article has focused primarily on quantitative data analysis, incorporating qualitative data sources, such as customer feedback, social media data, or expert opinions, can provide additional context and enrich the analysis process.
  • Ethical considerations and data governance : As data analysis becomes more prevalent, it’s crucial to address ethical concerns related to data privacy, bias, and responsible use of analytics.

Implementing robust data governance policies and adhering to ethical guidelines can help organizations maintain trust and accountability.



eTable 1. Classification of Indications

eTable 2. Dechallenge and Rechallenge With Semaglutide

eTable 3. Dechallenge and Rechallenge With Liraglutide

eTable 4. Sex, Median Age and Dose (IQR) for Suicidal Ideation ADRs by Drug and Indication

eTable 5. Coreported Psychiatric Reactions for Semaglutide

eTable 6. Coreported Psychiatric Reactions for Liraglutide by Indication

eTable 7. Number of Cases, Noncases, Other Adverse Drug Reactions (ADRs) and Total Number of Other Reports in the Database for Semaglutide

eTable 8. Number of Cases, Noncases, Other Adverse Drug Reactions (ADRs) and Total Number of Other Reports in the Database for Liraglutide

eTable 9. Disproportionality Analysis of Semaglutide-Associated Suicidal Ideation Compared With All Other Drugs in the Database in Female and Male Patients Separately

eTable 10. Number of Semaglutide-Associated Cases of Adverse Drug Reactions (ADRs) by Year

eTable 11. Number of Liraglutide-Associated Cases of Adverse Drug Reactions (ADRs) by Year

eReferences.

Data Sharing Statement



Schoretsanitis G , Weiler S , Barbui C , Raschi E , Gastaldon C. Disproportionality Analysis From World Health Organization Data on Semaglutide, Liraglutide, and Suicidality. JAMA Netw Open. 2024;7(8):e2423385. doi:10.1001/jamanetworkopen.2024.23385


Disproportionality Analysis From World Health Organization Data on Semaglutide, Liraglutide, and Suicidality

  • 1 The Zucker Hillside Hospital, Department of Psychiatry, Northwell Health, Glen Oaks, New York
  • 2 Department of Psychiatry, Zucker School of Medicine at Northwell/Hofstra, Hempstead, Glen Oaks, New York
  • 3 Department of Psychiatry, Psychotherapy and Psychosomatics, Hospital of Psychiatry, University of Zurich, Zurich, Switzerland
  • 4 Institute of Pharmaceutical Sciences, Department of Chemistry and Applied Biosciences, ETH Zurich, Zurich, Switzerland
  • 5 Institute of Primary Care, University of Zurich and University Hospital Zurich, Zurich, Switzerland
  • 6 WHO Collaborating Centre for Research and Training in Mental Health and Service Evaluation, Department of Neuroscience, Biomedicine and Movement Sciences, Section of Psychiatry, University of Verona, Verona, Italy
  • 7 Pharmacology Unit, Department of Medical and Surgical Sciences, University of Bologna, Bologna, Italy
  • 8 Institute of Social and Preventive Medicine, University of Bern, Bern, Switzerland

Question   Are glucagon-like peptide-1 receptor agonists semaglutide and liraglutide, which were originally introduced for the treatment of type 2 diabetes and are frequently prescribed due to their weight loss properties, associated with disproportionately increased reporting of suicidality?

Findings   This disproportionality analysis, using a case-control design based on the World Health Organization global database of suspected adverse drug reactions, identified a disproportionality signal of suicidal ideation with semaglutide; the signal remained significant when semaglutide was compared with dapagliflozin and metformin and in the subgroups of patients with coreported use of antidepressants and benzodiazepines.

Meaning   A detected signal of semaglutide-associated suicidal ideation warrants urgent clarification.

Importance   Glucagon-like peptide-1 receptor agonists (GLP-1 RAs) have gained use primarily due to their weight-reduction effects, although a regulatory review was undertaken for potential suicidality concern.

Objectives   To evaluate potential signals for suicidal and self-injurious adverse drug reactions (ADRs) associated with the GLP-1 RAs semaglutide and liraglutide.

Design, Setting, and Participants   Disproportionality analysis through the case-control design using the World Health Organization (WHO) global database of suspected ADRs. Participants were clinical patients worldwide experiencing an ADR suspectedly attributable to semaglutide or liraglutide in the database from inception to August 30, 2023. Data were analyzed from September to December 2023.

Exposure   Treatment with semaglutide or liraglutide regardless of indication or treatment duration.

Main Outcomes and Measures   Reporting odds ratio (ROR) and the bayesian information component (IC) with 95% CIs were calculated as measures of disproportionate reporting of suicidal and self-injurious ADRs associated with semaglutide and liraglutide compared with all other medications. Sensitivity analyses were conducted including patients with coreported use of antidepressants and benzodiazepines and using dapagliflozin, metformin, and orlistat as comparators. A disproportionality signal was considered when the lower limits of the ROR and IC were above 1 and 0, respectively.

Results   A total of 107 (median [IQR] age 48 [40-56] years; 59 female patients [55%]) and 162 (median [IQR] age 47 [38-60] years; 100 female patients [61%]) cases of suicidal and/or self-injurious ADRs were reported between November 2000 and August 2023 with semaglutide and liraglutide, respectively. Significant disproportionality was detected only for semaglutide-associated suicidal ideation (ROR, 1.45; 95% CI, 1.18-1.77; IC, 0.53; 95% CI, 0.19-0.78), which remained significant in patients with coreported use of antidepressants (ROR, 4.45; 95% CI, 2.52-7.86; IC, 1.96; 95% CI, 0.98-2.63) and benzodiazepines (ROR, 4.07; 95% CI, 1.69-9.82; IC, 1.67; 95% CI, 0.11-2.65), when compared with dapagliflozin (ROR, 5.56; 95% CI, 3.23-9.60; IC, 0.70; 95% CI, 0.36-0.95), metformin (ROR, 3.86; 95% CI, 2.91-5.12; IC, 1.20; 95% CI, 0.94-1.53) and orlistat (ROR, 4.24; 95% CI, 2.69-6.69; IC, 0.70; 95% CI, 0.36-0.95).

Conclusions and Relevance   This study using the WHO database found a signal of semaglutide-associated suicidal ideation, which warrants urgent clarification.

Over the past decade, obesity trends have reached epidemic proportions. 1 In this context, the understanding of glucagon-like peptide-1 (GLP-1)–based mechanisms and the related anorectic properties of GLP-1 receptor agonists (RAs) has revolutionized the treatment of obesity. 2 In addition to enhancing glucose-dependent insulin release, GLP-1 RAs may reduce glucagon secretion as well as gastric emptying. 2 Originally introduced for the treatment of type 2 diabetes, GLP-1 RAs soon caught research attention for their effect on weight loss. 3 The weight loss properties of liraglutide and semaglutide quickly went viral on social media, fueling their promotion as lifestyle drugs not just for patients with diabetes 4 and leading to a global shortage. 5 Currently, it is estimated that approximately 10% of patients with type 2 diabetes in the US are prescribed GLP-1 RAs. 6 Accordingly, regulatory authorities around the world have urged health care professionals to direct available supplies to patients with type 2 diabetes who are inadequately managed with other medications rather than to off-label prescriptions. 7 , 8 Despite the promising potential of GLP-1 RAs, serious concerns have been raised about their safety. 9 , 10 On July 3, 2023, a series of reports of suicidal or self-harming thoughts associated with liraglutide or semaglutide triggered an ongoing review by the European Medicines Agency (EMA). 11 Previously, in the approval trials, 9 of the 3384 patients treated with liraglutide (0.27%) had reported suicidal ideation compared with 2 of 1941 patients allocated to the placebo group (0.10%). 12 On the other hand, no patients using semaglutide for obesity had developed suicidal ideation, 13 , 14 and no mental health differences were observed in adolescents, with a lower percentage of participants in the semaglutide group than in the placebo group reporting psychiatric adverse events (7% vs 15%). 15

The EMA-led investigation might have a global impact, given that liraglutide and semaglutide are administered to more than 20 million people per year. 11 The investigation was expected to be completed in November 2023 but was ultimately updated in April 2024 after further clarifications were requested. 16 In the meantime, the British Medicines and Healthcare products Regulatory Agency and the US Food and Drug Administration (FDA) announced similar investigations. 17 So far, both the EMA and the FDA have declared that, based on the available evidence, they have not found any clear demonstration of a relationship between GLP-1 RAs and suicide, although the FDA investigation is still ongoing. 18 , 19

Marketing companies stated that warnings about suicidal behavior and ideation are formally required for medications prescribed for chronic weight management affecting the central nervous system. 20 The first 2 pharmacovigilance studies on the topic only included partial data from the US, 21 , 22 and a report from the EMA pharmacovigilance database did not assess disproportionality. 23 Typically, patients with suicidality are excluded from clinical trials; therefore, the reports from clinical trials may be less precise in capturing the risk of suicidal or self-injurious adverse drug reactions (ADRs) in later practice. In this context, we aimed to assess suicidal and/or self-injurious ADRs associated with liraglutide or semaglutide at a global level, using a World Health Organization (WHO) database of individual case safety reports (ICSRs).

This case-control study was conceived as a disproportionality analysis of the WHO VigiBase, a consolidated tool for postmarketing surveillance. In the past, large-scale ICSR databases have attracted interest for the early detection and characterization of emerging safety issues. 24 - 26

All procedures and analyses adhered to the Uppsala Monitoring Centre (UMC) caveat agreement for reporting standards and were in accordance with the Helsinki declaration for ethical principles in medical research. As the data were anonymized and all analyses were descriptive, an ethical review from the Zurich Cantonal ethics board was not required. Additionally, per the Common Rule, because all data in the database are anonymized, patient informed consent was not required. This study was reported according to The Reporting of A Disproportionality Analysis for Drug Safety Signal Detection Using Individual Case Safety Reports in PharmacoVigilance (READUS-PV) guideline. 27 - 29

We conducted a comprehensive search for reports of suicidal or self-injurious ADRs associated with liraglutide and semaglutide within the WHO global ICSR database, the largest pharmacovigilance archive worldwide, containing over 28 million reports of suspected ADRs from 140 member countries. On August 30, 2023, we selected all deduplicated ICSRs recorded in the database from inception. Reports for semaglutide were recorded between July 2011 and August 2023, whereas reports for liraglutide were collected between November 2000 and August 2023. Database services are offered by the UMC, which manages the database. 30 , 31 Drugs recorded on the reports are coded using the WHO drug dictionary 32 and ADRs are classified according to the Medical Dictionary for Regulatory Activities (MedDRA), version 23.1. 33 Two authors (C.G. and G.S.) screened ADRs involving liraglutide and/or semaglutide as suspected or interacting drugs to identify any report of suicidal and/or self-injurious ADRs as classified in the “suicide/self-injury” standardized MedDRA query (SMQ) of the MedDRA classification; such SMQs include any ADR related to suicidal and/or self-injurious thoughts and events. 34 Cases were all reports of suicidal and/or self-injurious ADRs, whereas controls were all other reports of suspected ADRs. We included both liraglutide-related and semaglutide-related reports.

Descriptive statistics on the demographic and clinical characteristics (medians and IQRs) of the reported cases were provided. We compared the percentage of female patients, age, and dose between patients prescribed GLP-1 RAs for different indications. We grouped indications into the following categories: diabetes, weight management, possible off-label indication, and others. The classification of indications is detailed in eTable 1 in Supplement 1 . Comparisons of demographic and clinical characteristics were performed using χ 2 , Wilcoxon, Kruskal-Wallis, or Fisher tests. We also explored psychiatric symptoms coreported with the suicidal and/or self-injurious ADRs of interest. Two-sided P values less than .05 were considered significant.

We performed disproportionality analyses using 2 consolidated measures when at least 3 reports were recorded: the reporting odds ratio (ROR) 35 and the bayesian information component (IC), 36 each with 95% CIs. We applied well-established thresholds to define signals of disproportionate reporting, that is, a lower limit of the 95% CI greater than 1 for the ROR and greater than 0 for the IC. Additional details about the disproportionality analysis are reported in the eMethods in Supplement 1 .
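For readers unfamiliar with these measures, the sketch below computes an ROR with its 95% CI from a 2×2 contingency table, along with an IC point estimate in the commonly used observed-versus-expected form; the counts are invented, and the UMC's exact bayesian implementation of the IC credible interval is not reproduced here:

```python
import math

# Hypothetical 2x2 table of report counts (not study data):
a = 100         # drug of interest, ADR of interest
b = 30_000      # drug of interest, all other ADRs
c = 100_000     # all other drugs, ADR of interest
d = 36_000_000  # all other drugs, all other ADRs

# Reporting odds ratio with a 95% CI computed on the log scale
ror = (a / b) / (c / d)
se = math.sqrt(1/a + 1/b + 1/c + 1/d)
lo = math.exp(math.log(ror) - 1.96 * se)
hi = math.exp(math.log(ror) + 1.96 * se)
print(f"ROR = {ror:.2f} (95% CI {lo:.2f}-{hi:.2f})")

# Information component: log2 of observed vs expected report counts,
# with +0.5 shrinkage to stabilize small counts. The study's criterion
# uses the lower limit of the IC's 95% credible interval, which requires
# the UMC's bayesian machinery and is not reproduced here.
n_total = a + b + c + d
expected = (a + b) * (a + c) / n_total
ic = math.log2((a + 0.5) / (expected + 0.5))
print(f"IC = {ic:.2f}")

# ROR signal of disproportionate reporting: lower CI limit above 1
print("ROR signal:", lo > 1)
```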

For suicidal and/or self-injurious ADRs that showed a signal of disproportionate reporting in the main disproportionality analysis, we performed sensitivity analyses to explore potential confounders: (1) selecting only cases with coreported antidepressant use and (2) only cases with coreported benzodiazepine use, as proxies of depressive and anxiety disorders that can increase the risk of suicidal and/or self-injurious behaviors and ideation. We then repeated the analyses (3) excluding reports with coreported antidepressants and (4) excluding reports with coreported benzodiazepines. To further mitigate the risk of confounding by indication and channeling bias, we performed 3 additional sensitivity analyses using other drugs prescribed for the same indications (obesity and type 2 diabetes) as comparators; specifically, we selected (5) dapagliflozin, a sodium-glucose cotransporter 2 inhibitor, and (6) metformin, considering their well-established role in the treatment of type 2 diabetes coupled with their favorable impact on body weight 37 , 38 ; and (7) orlistat, considering its indication for obesity and weight loss.
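
Mechanically, each of these sensitivity analyses reduces to recomputing the same 2×2 counts on a restricted set of reports. The sketch below illustrates this under the assumption of hypothetical boolean flag columns; it is not the authors' actual code.

```python
# Sketch of the sensitivity analyses: recompute 2x2 counts on subsets.
# Column names ("is_semaglutide", "is_metformin", "suicide_smq",
# "antidepressant_coreported") are hypothetical boolean flags.
import pandas as pd

def two_by_two(df: pd.DataFrame, drug: str, event: str) -> tuple:
    """Return (a, b, c, d) counts for a drug/event pair."""
    a = int((df[drug] & df[event]).sum())
    b = int((df[drug] & ~df[event]).sum())
    c = int((~df[drug] & df[event]).sum())
    d = int((~df[drug] & ~df[event]).sum())
    return a, b, c, d

reports = pd.read_parquet("reports.parquet")  # hypothetical extract

# (1)-(2): restrict to reports coreporting antidepressants (analogously,
# benzodiazepines); (3)-(4): exclude those reports instead.
with_ad = reports[reports["antidepressant_coreported"]]
without_ad = reports[~reports["antidepressant_coreported"]]

# (5)-(7): active-comparator analyses; keep only reports involving
# semaglutide or the comparator (here metformin) before recounting.
vs_metformin = reports[reports["is_semaglutide"] | reports["is_metformin"]]

for name, subset in [("with AD", with_ad), ("without AD", without_ad),
                     ("vs metformin", vs_metformin)]:
    print(name, two_by_two(subset, "is_semaglutide", "suicide_smq"))
```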

We also assessed the disproportionality of reporting in female and male patients separately. Last, to assess the trend of reporting over time, we reported the number of reports by year for all ADRs of interest and for each ADR with disproportionate reporting. Data were analyzed from September to December 2023.
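
The yearly trend tabulation is a simple grouped count of cases and total reports; a minimal sketch, again with hypothetical column names, follows.

```python
# Sketch of the reporting-trend tabulation: yearly share of suicidal
# and/or self-injurious reports among all reports for one drug.
# Column names are hypothetical; "report_date" is assumed datetime.
import pandas as pd

reports = pd.read_parquet("reports.parquet")
sema = reports[reports["is_semaglutide"]]

trend = (sema.groupby(sema["report_date"].dt.year)["suicide_smq"]
             .agg(n_cases="sum", n_total="size"))
trend["percent"] = 100 * trend["n_cases"] / trend["n_total"]
print(trend)
```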

As of August 30, 2023, of the 36 172 078 total reports in the database, we identified a total of 107 (median [IQR] age, 48 [40-56] years; 59 female patients [55%]; median [IQR] treatment duration, 24.0 [2.3-61.0] days [data from 28 reports]) unique, deduplicated cases of suicidal and/or self-injurious ADRs associated with semaglutide (107 of the 30 527 total reports [0.35%]) and 162 (median [IQR] age, 47 [38-60] years; 100 female patients [61%]; median [IQR] treatment duration, 46.0 [14.0-99.0] days [data from 33 reports]) cases associated with liraglutide (162 of 52 131 total reports [0.31%]). Demographic and clinical characteristics of the cases are reported in Table 1 by GLP-1 RA.

Regarding indications for use, the main reason for prescription was possible off-label use (34 patients for semaglutide [31.8%] and 55 for liraglutide [33.9%]), followed by weight management (28 for semaglutide [26.2%] and 40 for liraglutide [24.7%]), diabetes (26 for semaglutide [24.3%] and 33 for liraglutide [20.4%]), and polycystic ovary syndrome in 1 case for each GLP-1 RA (1 for semaglutide [0.9%] and 1 for liraglutide [0.6%]). Semaglutide-associated cases were reported by consumers in almost half of the reports (52 cases [48.6%]), whereas liraglutide-associated cases were mainly reported by health professionals (103 cases [63.6%]).

Regarding outcomes following dechallenge and rechallenge, both for semaglutide (eTable 2 in Supplement 1) and liraglutide (eTable 3 in Supplement 1), suicidal ideation resolved after drug discontinuation in 62.5% of the cases. In the semaglutide-associated reports of suicidal and/or self-injurious ADRs, the most common comedications included antidiabetics (17 patients [15.9%]) and antidepressants (14 patients [13.1%]), with higher percentages for liraglutide (49 patients [30.3%] and 30 patients [18.5%], respectively) (Table 1).

Suicidal and/or self-injurious ADRs associated with semaglutide and/or liraglutide are presented in Table 2. Suicidal ideation, intentional overdose, and suicide attempt ranked highest for semaglutide (94 patients [88%], 7 patients [6.5%], and 7 patients [6.5%], respectively), whereas suicidal ideation, completed suicide, and suicide attempt ranked highest for liraglutide (116 patients [71.6%], 19 patients [11.7%], and 16 patients [9.9%], respectively). Seven reactions (6.5%) were fatal for semaglutide and 24 (14.8%) for liraglutide. Table 2 also reports the number and percentage of each ADR by indication. Among reports of suicidal ideation, patients prescribed liraglutide for diabetes were older and less often female than those prescribed it off-label or for weight management (eTable 4 in Supplement 1).

The coreported psychiatric symptoms for semaglutide and liraglutide are listed in eTable 5 and eTable 6 in Supplement 1. For liraglutide, there were 91 cases in which suicidal and/or self-injurious ADRs were reported without any other psychiatric symptoms, 50 cases in which 1 psychiatric symptom was coreported, and 21 in which 2 or more other psychiatric symptoms were coreported. Table 2 shows the number of cases in which suicidal and/or self-injurious ADRs were reported alone, by indication. For both semaglutide and liraglutide, the drug was taken off-label in half of these cases (Table 2).

We detected a significant disproportionality only for semaglutide-associated suicidal ideation compared with all medications (ROR, 1.45; 95% CI, 1.18-1.77; IC, 0.53; 95% CI, 0.19-0.78) (Table 3). Data on treatment duration were available for 26 patients reporting suicidal ideation with semaglutide, with a mean (range) of 80.39 (0 to 610) days between semaglutide treatment initiation and the occurrence of suicidal ideation. We did not find signals for any other ADR of interest (Table 3). Numbers of cases and controls are reported in eTable 7 and eTable 8 in Supplement 1.

The first sensitivity analysis, restricted to cases with antidepressant comedication, showed disproportionate reporting of semaglutide-associated suicidal ideation compared with all medications (ROR, 4.45; 95% CI, 2.52 to 7.86; IC, 1.96; 95% CI, 0.98 to 2.63). The second sensitivity analysis, restricted to cases with benzodiazepine comedication, likewise showed disproportionate reporting compared with all medications (ROR, 4.07; 95% CI, 1.69 to 9.82; IC, 1.67; 95% CI, 0.11 to 2.65). Reporting was not disproportionate when reports with antidepressant comedication were excluded (ROR, 1.28; 95% CI, 1.03 to 1.60; IC, 0.36; 95% CI, −0.10 to 0.63) but remained disproportionate when reports with benzodiazepine comedication were excluded (ROR, 1.40; 95% CI, 1.13 to 1.72; IC, 0.48; 95% CI, 0.13 to 0.73). The fifth sensitivity analysis showed disproportionate reporting of semaglutide-associated suicidal ideation compared with dapagliflozin (ROR, 5.56; 95% CI, 3.23 to 9.60; IC, 0.70; 95% CI, 0.36 to 0.95); the sixth, compared with metformin (ROR, 3.86; 95% CI, 2.91 to 5.12; IC, 1.20; 95% CI, 0.94 to 1.53); and the seventh, compared with orlistat (ROR, 4.24; 95% CI, 2.69 to 6.69; IC, 0.70; 95% CI, 0.36 to 0.95). The analysis of disproportionality for suicidal ideation in female and male patients separately yielded disproportionate reporting in male patients (ROR, 1.51; 95% CI, 1.09 to 2.08; IC, 0.58; 95% CI, 0.03 to 0.97), whereas in female patients, although the lower limit of the 95% CI of the ROR was greater than 1 (ROR, 1.35; 95% CI, 1.02 to 1.77), the lower limit of the 95% CI of the IC was less than 0 (IC, 0.42; 95% CI, −0.05 to 0.75) (eTable 9 in Supplement 1).

From each drug's first year of marketing until August 2023, there was a slight increase in the proportion of suicidal and/or self-injurious ADRs reported for both drugs: from 0% (2017) to 0.8% (2023) for semaglutide and from 0.09% (2014) to 0.4% (2023) for liraglutide (eTable 10 and eTable 11 in Supplement 1).

In this disproportionality analysis of the world’s largest ICSRs database using a case-control design, we found a significant disproportionality only for semaglutide-associated suicidal ideation compared with other medications. The number of reports showed a gradual increase over the years, which may indicate a widening therapeutic scope in obesity and accumulating clinical experience.

To our knowledge, no previous reports investigated the association between semaglutide and suicidal ideation using this database. In our sensitivity analyses, the disproportionality remained significant when restricting to cases with coreported antidepressants or benzodiazepines, suggesting that people with anxiety and depressive disorders may be more likely to report suicidal ideation when medicated with semaglutide. When repeating the analysis after excluding cases in which antidepressants were coreported, we did not detect a disproportionality signal; in contrast, when excluding cases in which benzodiazepines were coreported, the disproportionality remained significant. This pattern is consistent with an interaction between baseline psychopathology and semaglutide effects and warrants further investigation. Although the EMA stated that no update to the product information is warranted, based on these findings we believe that a precaution for use in patients with psychiatric disorders or psychological lability could be added to the semaglutide package insert. Notably, the FDA label of semaglutide for obesity already warns prescribers to monitor for depression or suicidal thoughts. 39

One study using the FDA pharmacovigilance database suggested disproportionate reporting of suicidal ideation and suicidal depression for semaglutide and liraglutide, 21 whereas another study did not detect an association between suicidality and GLP-1 RAs. 22 Likewise, a cohort study using electronic health records did not detect higher risks of suicidal ideation in patients with obesity or diabetes treated with semaglutide compared with non–GLP-1 RAs. 40 Unlike that study, our analysis also included patients with potential off-label prescriptions of GLP-1 RAs; our findings may therefore generalize to patients receiving GLP-1 RAs without a diagnosis of diabetes or obesity, further confirming the complementary nature of ICSR-based disproportionality analyses and longitudinal observational designs.

Evidence from bariatric studies suggests that a history of depression or anxiety is a predictor of postoperative suicide risk, providing context for this interplay 41 ; the authors discussed this association in light of frustration arising from high expectations of bariatric surgery outcomes in patients with limited resources to deal with mental distress. 41 Of note, the pivotal trials of semaglutide in obesity had various exclusion criteria for mental disorders, such as major depressive disorder within 2 years before screening, diagnosis of severe psychiatric disorders, and history of suicide attempts. 13 , 15 An alternative hypothesis is that very rapid weight loss leads to adjustment problems, such as an inability to eat as expected, which ultimately exacerbate mental distress in highly vulnerable patients. 41 Due to the lack of data on GLP-1 RA–related changes from baseline weight or BMI, we could not test either hypothesis.

Although comorbidity with depression is high in patients with diabetes, 42 the coreporting between antidepressants and antidiabetics was negligible. Furthermore, the signal of semaglutide remained when comparing with dapagliflozin, metformin, and orlistat, mitigating the risk of confounding by indication for diabetes and obesity. Therefore, patients with diabetes and/or obesity without psychiatric comorbidities may not be at high risk of semaglutide-associated suicidal ideation. Although ADR incidence cannot be calculated using the spontaneous reporting system, this ADR is likely to be rare and would probably not substantially alter the benefit-risk profile of semaglutide in approved therapeutic settings. However, the observed high proportion of cases due to possible off-label use and a recently published postmarketing signal of misuse or abuse 43 call for urgent clarification of patient-related and drug-related risk factors; cohort studies and large registries should be stratified by therapeutic indication, sex, and history of mental disorders, and data from off-label use should be retrieved.

Despite the large number of cases, we did not detect any signals for liraglutide-associated suicidal and/or self-injurious ADRs. Pooled data from phase 2 and 3 trials on liraglutide vs placebo for weight management identified a potential risk for suicidal ideation. 12 Nine of 3384 participants in the liraglutide group vs 2 of 1941 in the placebo group reported suicidal ideation or behavior during the trial (0.27% vs 0.10%). 12

The results of this study should be interpreted in light of several limitations. 25 First, we need to consider barriers to reporting and missing information. Second, the well-known inability to infer causality from spontaneous reports does not allow us to attribute any reaction to the effect of a drug. Third, the lack of a denominator does not allow us to estimate the incidence of ADRs. Fourth, selection and collider bias, 44 as well as confounding by indication and channeling bias, although partially mitigated by our sensitivity analyses, may have played a role, as people with treatment-resistant diabetes or obesity might reflect a subgroup of patients with more severe conditions, including a higher risk of mental distress. Additional adjustments for potential confounders, such as alcohol or substance misuse, were limited by the relatively small number of reports found. In the absence of more details about off-label prescribing, we were not able to further qualify the extent to which prescribing was off-label and its impact on the results. Fifth, the lack of treatment outcomes, such as weight change, did not allow alternative hypotheses to be tested. Sixth, the high proportion of cases with missing data on medication dose precluded a dose-response analysis. Seventh, because of the absence of information on the sociodemographic profile of the reporters, it is impossible to account for volunteer bias. Eighth, it is not possible to exclude that, for instance, suicidal ideation preexisted treatment. Ninth, data on treatment duration until the ADR were provided in only a small number of reports. Additionally, because disproportionality measures depend on the reporting of all other drugs and events in the database, the lack of statistically significant disproportionality should not be automatically interpreted as a safety endorsement. Several factors may influence the reporting pattern and the ability to detect disproportionality, including known and widely reported ADRs such as gastrointestinal ADRs.

Our findings are relevant to the general reader seeking up-to-date information, because personal and anecdotal reports may continue to gain popularity on social media platforms without accompanying knowledge of the risks. 45 One consequence of this trend may be an increase in the off-label use of semaglutide, a public health concern that has already led to illegal trade in semaglutide pens, some of which are counterfeit. 46 Recently, public warnings about counterfeit semaglutide pens were issued in the United Kingdom and the US. 46 , 47 Considering the risk of suicidal ideation in people taking semaglutide off-label, authorities should consider issuing a warning to inform about this risk.

In this disproportionality study of an ADR database, we found a disproportionality signal of suicidal ideation with semaglutide but not with liraglutide, particularly among patients with coreported antidepressant use, a proxy for affective disorders (a notable exclusion criterion of the premarketing clinical trials).

Accepted for Publication: May 21, 2024.

Published: August 20, 2024. doi:10.1001/jamanetworkopen.2024.23385

Open Access: This is an open access article distributed under the terms of the CC-BY License. © 2024 Schoretsanitis G et al. JAMA Network Open.

Corresponding Author: Georgios Schoretsanitis, MD, PhD, The Zucker Hillside Hospital, Behavioral Health Pavilion, 75-59 263rd St, Glen Oaks, NY 11004 ([email protected]).

Author Contributions: Drs Schoretsanitis and Gastaldon had full access to all of the data in the study and take responsibility for the integrity of the data and the accuracy of the data analysis.

Concept and design: All authors.

Acquisition, analysis, or interpretation of data: Schoretsanitis, Weiler, Raschi, Gastaldon.

Drafting of the manuscript: Schoretsanitis, Gastaldon.

Critical review of the manuscript for important intellectual content: All authors.

Statistical analysis: Schoretsanitis, Gastaldon.

Administrative, technical, or material support: Barbui.

Supervision: Schoretsanitis, Barbui, Raschi.

Conflict of Interest Disclosures: Dr Schoretsanitis reported receiving personal fees from HLS, Dexcel, Saladax, and Thermo Fisher outside the submitted work. Dr Weiler reported being a member of the Human Medicines Expert Committee of Swissmedic. No other disclosures were reported.

Disclaimer: The views expressed in this article are the personal views of the authors and may not be understood or quoted as being made on behalf of or reflecting the position of the European Medicines Agency or one of its committees or working parties. While the authors used data from VigiBase, the World Health Organization (WHO) global database of individual case safety reports, as a source of information, the conclusions do not represent the opinion of the Uppsala Monitoring Centre (UMC) or the WHO.

Data Sharing Statement: See Supplement 2.

Additional Contributions: The authors acknowledge the UMC, which provided and gave permission to use the data analyzed in the present study. The authors are also indebted to Dr Leonie Heron, PhD, Institute for Social and Preventive Medicine, University of Bern, Bern, Switzerland, who helped in editing this article. She was not compensated for her services.



