
The Gap in Power Electronics Design Tools, and How We're Closing It

How switchmode.io aims to solve the issues with the power supply design workflow.

February 17, 2026
Philip Bassett

Every power electronics engineer knows the drill. You start a new converter design, and within hours you're juggling six disconnected tools, three spreadsheets, and a stack of application notes, trying to answer questions that should be straightforward but never are.


The problem every power engineer faces

Designing a power converter isn't one problem. It's dozens of interconnected problems: topology selection, component sizing, loss analysis, thermal management, magnetics design, control loop stability, EMI filtering, protection circuits. The answer to each one depends on all the others.

Change your switching frequency to reduce magnetics size and your MOSFET switching losses go up. Increase output capacitance to meet a transient spec and your control loop compensation needs redesigning. Add an input EMI filter and the impedance interaction might destabilise your converter. Swap to a different core material because your supplier can't deliver and your entire loss budget shifts.

This isn't a problem of missing knowledge. The analytical methods exist. State-space averaging, the Steinmetz equation, Dowell's method, Middlebrook's criterion: these have been well established for decades. The problem is that applying them means working across disconnected tools that don't talk to each other, don't share operating point data, and force engineers into tedious manual iteration that consumes days of design time on every new project.

The current tooling landscape

Today's power electronics engineer works across several tiers of tools. Each has real value. None covers the full workflow.

Manufacturer calculators are the starting point for most designs. These free online tools are polished, accessible, and useful for initial sizing within a single manufacturer's ecosystem. Some are remarkably capable, generating complete schematics, BOMs, efficiency curves, and even Bode plots from a set of input specifications. But they exist to sell components. Every recommendation points to that manufacturer's parts. Try designing a converter with a controller from one vendor, MOSFETs from another, a third-party core, and ceramic capacitors from a fourth. You're back to spreadsheets. Engineers routinely replicate the same design across three or four vendor tools just to compare solutions, with no guarantee the tools are using equivalent assumptions.

Desktop design software occupies a more capable tier. These programs have spent decades encoding detailed analytical models, covering dozens of topologies with integrated magnetics design, control loop analysis, and cycle-by-cycle simulation. These tools are genuinely powerful, and the engineering knowledge they encapsulate is world-class. They represent the best of what's available for single-converter design, built by people who deeply understand the discipline.

But they carry the constraints of their era. Desktop installation, platform lock-in, upfront license costs in the thousands, and architectures rooted in spreadsheet engines or legacy frameworks. More fundamentally, they treat each converter as an isolated design. The engineer working on a multi-stage system must design each stage independently, manually extract impedance data from one stage, and feed it into the next. The tool doesn't see the system; only the engineer does.

Web-based design tools have started to appear, some with impressive feature lists covering loss analysis, magnetics design, loop analysis, reliability estimation, and BOM generation. On paper, they look comprehensive. In practice, many suffer from interfaces that are overwhelming to navigate, workflows that demand significant expertise just to get started, and fragmented feature sets that feel bolted together rather than designed as a coherent experience. Feature count alone doesn't solve the problem if the engineer spends more time fighting the tool than designing the converter.

Spreadsheets and in-house tools fill the gaps between the tiers above. Every experienced power engineer has a collection of Excel files accumulated over a career: a magnetics design spreadsheet here, a loop compensation calculator there, a loss estimation template from a previous project. These are flexible but fragile. They encode assumptions that made sense for one design but break silently on the next. They don't propagate changes. And they represent years of individual effort that isn't shared, standardised, or validated.

Circuit simulators are the gold standard for verification. They capture real switching behaviour, model non-idealities, and can sweep parameters across operating ranges. But simulation is verification, not design. Setting up a switching simulation from scratch takes hours. Getting it to converge on the right operating point takes more. And while simulation will faithfully tell you that your design doesn't work, it won't tell you why or suggest what to change.

The real cost isn't in any single tool. It's in the seams between them: the manual data transfer, the inconsistent assumptions, the operating point that drifted when you recalculated one stage but forgot to update the next. For multi-stage converter systems, this fragmentation becomes acute. An EMI filter, a PFC front end, an isolated DC-DC stage, and a post-regulator each affect the others through impedance interactions, but no existing tool analyses them as a connected system.

Vendor neutrality isn't a nice-to-have

It's worth dwelling on the manufacturer lock-in problem because it's more damaging than it first appears.

When a vendor tool recommends components, it's optimising within its own catalogue, not across the full market. A MOSFET from vendor A might have lower conduction loss, but a competing part from vendor B might have lower gate charge, giving better switching performance at your frequency. You'll never see that comparison in either vendor's tool.

Worse, some manufacturer tools use simplified or generic models for passive components. Engineers have reported building hardware from vendor tool outputs only to find instability, because the tool used template-based capacitor models that didn't capture the ESL or frequency-dependent ESR of the actual part selected. The design looked stable in the tool. It wasn't stable on the bench.

A design tool should be indifferent to who manufactured the components. The engineer's job is to find the best parts for their application, considering electrical performance, thermal behaviour, package availability, and cost. The tool's job is to evaluate candidates honestly, using models derived from real datasheet data rather than generic approximations.

The multi-stage problem nobody has solved

Single-stage converter design is reasonably well-served by existing tools, at least for common topologies. Desktop design software handles individual converters with impressive depth. Multi-rail DC power tree tools exist for selecting point-of-load regulators feeding FPGAs and processors from a common bus. But real-world high-power systems sit in a completely different category.

An EV on-board charger is a PFC stage cascaded with an isolated DC-DC. A telecom rectifier is an EMI filter, PFC, intermediate bus converter, and multiple point-of-load regulators. A satellite power system is a solar array regulator feeding a battery charge/discharge unit feeding multiple isolated converters. Industrial motor drives, medical power supplies, data centre power shelves: multi-stage architectures are the norm, not the exception.

The engineering challenges in multi-stage systems are qualitatively different from single-stage design. The efficiency chain means losses compound; three stages at 96% each give you 88.5% system efficiency, not 96%. Impedance interactions between stages can create instabilities that don't exist in any individual stage tested in isolation. The thermal environment is shared, with heat from one stage's magnetics raising the ambient for the adjacent stage's semiconductors. The input voltage range seen by a downstream stage depends on the regulation quality of the upstream stage.
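The efficiency-chain arithmetic above is worth making concrete. A minimal sketch (illustrative only, not switchmode.io's implementation):

```python
from math import prod

def chain_efficiency(stage_effs):
    """System efficiency of cascaded stages is the product of the
    individual stage efficiencies, so losses compound multiplicatively."""
    return prod(stage_effs)

# Three stages at 96% each do not give a "96% system":
eta = chain_efficiency([0.96, 0.96, 0.96])
print(f"{eta:.1%}")  # 88.5%
```

The same function makes the sensitivity obvious: improving any single stage by one point moves the whole chain, which is why system-level loss budgeting matters.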

No existing tool designs a complete multi-stage power system as an integrated analytical flow. Every multi-rail tool on the market operates at the device-selection level, not the circuit-design level. Engineers design each stage independently, then struggle to integrate them, often discovering interaction problems only at the prototype stage, when it's expensive to fix.

What "pre-simulation" design actually needs

There's a critical phase in every converter design that sits between "back-of-envelope feasibility" and "switching simulation for verification." This is where the real design decisions happen: component selection, magnetics sizing, loss budgeting, stability assessment, thermal validation. Engineers spend the majority of their design time here, and it's exactly where the tool gap is widest.

What this phase needs isn't simulation. It's fast, analytical, interconnected calculation that lets an engineer explore the design space, understand trade-offs, and converge on a viable design point before committing to the time and effort of a full switching simulation.

The key word is interconnected. A loss calculation that doesn't feed into a thermal model is incomplete. A thermal model that doesn't feed back into temperature-dependent losses is inaccurate. A stability analysis that doesn't account for the impedance of the upstream filter stage is dangerous. A capacitor selection that meets the ripple spec but fails the transient spec is a costly bench surprise.

What engineers actually need is a single environment where changing a switching frequency automatically recalculates semiconductor losses, updates core loss in the magnetics, re-evaluates the thermal operating point, checks that the control loop bandwidth is still adequate, and verifies that the input filter interaction hasn't gone unstable. Not simulation-level accuracy, but analytical accuracy that's good enough to make the right design decisions, fast enough to enable exploration, and honest enough about its own limitations.

And it needs to be simple. Not dumbed down, but designed so that the interface stays out of the way and lets the engineer focus on the design. A tool with every feature imaginable is worthless if it takes an hour to find the right screen. Power electronics engineers are busy. The tool should respect their time.

Our approach: a system-level design canvas

Switchmode.io is a web-based design platform built specifically for multi-stage power converter systems. The interaction model is a visual canvas where the engineer builds a power system from functional blocks: input sources, output loads, converter topologies, EMI filters, bus connections, and heatsinks. Drag them on, connect them, configure them, run the analysis. The system architecture is visible at all times.

Each block on the canvas is a node containing component slots and parameters. A topology node holds the switching devices, magnetics, capacitors, gate driver configuration, and compensation network. A filter node holds the differential-mode and common-mode components. A heatsink node defines the thermal path to ambient. The connections between nodes carry both electrical and thermal relationships, because in a real system, two converter stages might be electrically in series but thermally coupled to the same heatsink, or electrically parallel but thermally isolated on separate boards.

This separation of electrical and thermal paths is deliberate. It lets the engineer model the actual physical arrangement of their system rather than forcing the thermal environment to follow the electrical topology. Three converter stages on one heatsink, a fourth on a separate cold plate, all electrically connected through a shared DC bus: the canvas captures that naturally.

Depth without complexity

The analytical depth lives inside the nodes, revealed progressively as the engineer needs it.

At the system level, the canvas shows overall efficiency, total losses, and thermal status across the full power chain. Each node displays a summary: stage efficiency, dominant loss mechanism, junction temperature. An engineer doing initial feasibility gets useful answers without clicking into anything.

Drill into a topology node and the view shifts to the stage level. Component slots show selected parts with their key parameters. Results show the efficiency curve, loss breakdown by component, thermal operating points, and stability margins. This is where most detailed design work happens.

Drill into a component slot and the view reaches the component level: full parametric detail, datasheet curves, temperature-dependent models, tolerance ranges. This is where the expert engineer fine-tunes a specific choice.

Drill into a result and the view opens the analysis level: loss waterfall charts, Bode plots, thermal impedance networks, worst-case bounds. The deepest level, accessed only when needed.

This layered approach means the tool scales with the engineer's intent. Initial sizing stays fast and clean. Detailed design has full access to every parameter and model. The complexity is always there; it's just not in the way until you need it.

Real parts, real data, any manufacturer

Unlike tools that work with abstract parameters or single-vendor catalogues, switchmode.io is built around real components from across the industry. Rather than typing "R_DS(on) = 10 mΩ" into a field, the engineer selects an actual part from a searchable database, and all model parameters populate automatically: temperature coefficients, parasitic capacitance curves, thermal impedance, gate charge profiles.

This matters because real components don't behave like their headline numbers. MOSFET on-resistance varies with gate voltage and temperature. Ceramic capacitor effective capacitance drops dramatically under DC bias. Core loss density depends on flux swing, frequency, and temperature in ways that a single Steinmetz coefficient doesn't capture. By modelling from real datasheet curves rather than simplified scalars, the analytical predictions stay closer to what the engineer will measure on the bench.
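To illustrate why headline numbers mislead, here is a sketch of temperature-dependent conduction loss. The exponential temperature coefficient is an assumed, part-specific value that would be fitted to a datasheet's normalised R_DS(on) curve; the numbers are hypothetical:

```python
def rdson_at_temp(rdson_25, tj_c, tc=0.007):
    """Scale the 25 degC on-resistance to junction temperature with an
    exponential coefficient. tc = 0.007/degC is an assumed fit that
    roughly doubles R_DS(on) of a silicon MOSFET between 25 and 125 degC;
    a real tool would fit this to the datasheet curve."""
    return rdson_25 * (1.0 + tc) ** (tj_c - 25.0)

def conduction_loss(i_rms, rdson_25, tj_c):
    """P_cond = I_rms^2 * R_DS(on)(Tj)."""
    return i_rms ** 2 * rdson_at_temp(rdson_25, tj_c)

# A "10 mOhm" part at Tj = 125 degC: resistance roughly doubles,
# and so does the conduction loss versus the headline value.
print(rdson_at_temp(0.010, 125))        # ~0.020 ohm
print(conduction_loss(10.0, 0.010, 125))  # ~2.0 W, not 1.0 W
```

The same pattern applies to MLCC DC bias derating and core loss: evaluate the curve at the operating point, not the nameplate value.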

The component database is vendor-neutral by design. When you search for a 100V MOSFET, you see candidates from every manufacturer, ranked by actual performance in your circuit, not by who built the tool. The recommendation engine runs your loss and thermal models against each candidate and surfaces the components that give you the best system-level performance, not just the lowest number on a single parameter.

Wide bandgap devices need different models

The shift from silicon to GaN and SiC has changed what switching loss models need to capture. Traditional silicon MOSFET loss calculations based on voltage-current overlap during linear-mode switching don't apply to GaN HEMTs, where switching transitions are dominated by charge-based capacitive effects and there is no reverse recovery. SiC MOSFETs have their own subtleties: minimal but non-zero body diode recovery that can increase turn-on losses significantly at elevated temperatures, and dv/dt-induced false turn-on from Miller capacitance coupling.

Switchmode.io implements device-appropriate loss models rather than forcing all semiconductors through a single framework. For GaN and SiC devices in soft-switched topologies, the tool uses charge-based ZVS analysis rather than the traditional energy-based ½CV² approximation, which has been shown in the literature to give incorrect results for nonlinear output capacitances. For hard-switched applications, datasheet-parameter-based analytical models capture the gate charge profile, nonlinear C_oss, and device-specific switching characteristics without requiring simulation.
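The distinction between charge-based and energy-based analysis can be sketched numerically. The power-law C_oss model below is hypothetical (the exponent and scale would be fitted to a real device's datasheet curve); the point is that for a strongly nonlinear capacitance, the stored energy differs markedly from the naive ½CV² evaluated at the bus voltage:

```python
def qoss(vbus, c_oss, n=1000):
    """Charge in the nonlinear output capacitance up to vbus:
    Q_oss(V) = integral of C_oss(v) dv, by the trapezoid rule."""
    dv = vbus / n
    cs = [c_oss(i * dv) for i in range(n + 1)]
    return sum((cs[i] + cs[i + 1]) / 2 * dv for i in range(n))

def eoss(vbus, c_oss, n=1000):
    """Energy stored in the nonlinear capacitance:
    E_oss(V) = integral of v * C_oss(v) dv."""
    dv = vbus / n
    f = [(i * dv) * c_oss(i * dv) for i in range(n + 1)]
    return sum((f[i] + f[i + 1]) / 2 * dv for i in range(n))

# Hypothetical power-law C_oss fit (farads, v in volts).
c_oss = lambda v: 1e-9 * (1.0 + v) ** -0.5

vbus = 400.0
e_linear = 0.5 * c_oss(vbus) * vbus ** 2  # naive 1/2 C V^2 at the bus voltage
print(qoss(vbus, c_oss), eoss(vbus, c_oss), e_linear)
```

For ZVS, it is Q_oss (the charge the inductor current must remove within the dead time) that sets the boundary, which is why the charge-based formulation is the right one for wide bandgap devices.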

Exploration and comparison as a core workflow

Most design tools treat a converter design as a single fixed point. Change a parameter and the previous state is gone. Want to compare two MOSFETs? Write down the numbers from the first, swap the part, write down the numbers from the second, compare manually. Want to evaluate the trade-off between 200 kHz and 400 kHz switching frequency? Same tedious process.

Switchmode.io treats comparison as a first-class operation at three levels.

Parameter sweeps let the engineer vary a parameter across a range and see how every calculated result responds. Sweep switching frequency from 100 kHz to 500 kHz and watch efficiency, magnetics size, semiconductor losses, and thermal margins move together. Each point in the sweep is a complete, self-consistent design: the magnetics are re-optimised at every frequency, the thermal model recalculates, the loss budget rebalances. This isn't just plotting a curve; it's generating dozens of fully-worked designs and presenting the trade-off space.

Component comparison lets the engineer place multiple parts in the same component slot and see them evaluated side by side under identical operating conditions. Drop three MOSFETs into the high-side switch slot and the tool calculates conduction loss, switching loss, gate drive power, and junction temperature for each, in context. Not in a vacuum against a generic test circuit, but in the actual converter the engineer is designing, at the actual operating point, with the actual thermal environment. This is what vendor-neutral comparison should look like.

Topology branching lets the engineer explore fundamentally different design directions without losing work. Using a version control model similar to software development, the engineer can branch a design, change the topology from a flyback to a forward converter, or restructure the entire power chain, and compare the two branches against each other. Same input requirements, same output specifications, different architectures, quantitative comparison. This is the kind of exploration that currently requires building two completely separate designs in two separate files (or two separate tools) and manually aligning the comparison.

These three levels of comparison address the reality that power electronics design is rarely about finding the answer. It's about understanding trade-offs and making informed choices. A tool that only shows one design point at a time forces the engineer to hold the alternatives in their head. A tool that shows the design space helps the engineer think.

Design flow: from auto-design to expert override

A tiered design flow matters. An engineer doing initial feasibility needs a different interaction than one doing final optimisation.

At its simplest, the engineer specifies power requirements, selects a topology, and the tool produces a complete starting design with component values, compensation network, and magnetics sizing based on established design rules. These aren't arbitrary defaults; they encode practical heuristics like crossover frequency relative to switching frequency, RHP zero margin, current ripple ratios, and thermal derating that reflect real engineering practice.
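To make the idea of encoded design rules concrete, here is a sketch of a first-pass buck sizing. The specific rules (30% ripple ratio, crossover at f_sw/10) are common textbook heuristics used for illustration, not switchmode.io's actual rule set:

```python
def buck_starting_point(vin, vout, iout, fsw, ripple_ratio=0.3):
    """Illustrative first-pass buck converter sizing from common
    design rules (hypothetical rule values):
    - inductor peak-to-peak ripple set to ripple_ratio * Iout
    - control loop crossover targeted at fsw / 10
    """
    duty = vout / vin                        # ideal CCM duty cycle
    di = ripple_ratio * iout                 # peak-to-peak ripple current
    inductance = vout * (1 - duty) / (di * fsw)
    f_crossover = fsw / 10                   # classic bandwidth rule of thumb
    return {"duty": duty, "L": inductance, "f_c": f_crossover}

d = buck_starting_point(vin=12.0, vout=3.3, iout=5.0, fsw=500e3)
print(d)  # duty ~0.275, L ~3.2 uH, f_c = 50 kHz
```

From a starting point like this, every subsequent override (a different capacitor, a higher frequency) re-runs the dependent calculations rather than leaving stale values behind.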

From that starting point, the engineer can override any individual choice while the tool maintains consistency across everything else. Change the output capacitor and the compensation network updates. Select a different MOSFET and the thermal model recalculates. Push the switching frequency higher and see the immediate impact on magnetics size, semiconductor losses, and EMI filter requirements, all on the same screen.

The key difference from existing tools is that this flow operates at the system level, not the stage level. The engineer builds the full power chain on the canvas, and every change propagates through all connected stages. Adjusting the switching frequency of one converter automatically updates the impedance interaction analysis at every interface in the system.

The analytical engine

Loss analysis covers every significant mechanism: MOSFET conduction with temperature-dependent R_DS(on); switching loss via device-appropriate models (gate charge analysis for hard switching, charge-based ZVS analysis for soft switching); output capacitance (C_oss) energy; body diode reverse recovery; dead-time conduction. Diode losses use a piecewise-linear model extracted from datasheet curves. Magnetic losses use the iGSE (improved Generalised Steinmetz Equation) for core loss with DC bias correction, and Dowell's method for AC winding loss including proximity effects. Capacitor losses account for ESR variation with frequency and temperature, and for MLCC DC bias derating. Gate driver power consumption and snubber/clamp dissipation, with selectable snubber circuit topologies, complete the loss budget.
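As one example of how these models work, the iGSE computes core loss from the classic Steinmetz coefficients but applied to the actual flux waveform. A sketch for triangular flux, with a hypothetical Steinmetz fit (k, alpha, beta would come from the core material's datasheet):

```python
from math import pi, cos

def ki_from_steinmetz(k, alpha, beta, n=10000):
    """iGSE coefficient k_i derived from classic Steinmetz (k, alpha, beta),
    per Venkatachalam et al. (2002), with the angular integral evaluated
    numerically."""
    dtheta = 2 * pi / n
    integral = sum(abs(cos(i * dtheta)) ** alpha for i in range(n)) * dtheta
    return k / ((2 * pi) ** (alpha - 1) * integral * 2 ** (beta - alpha))

def igse_triangular(k, alpha, beta, f, delta_b, duty=0.5):
    """Time-averaged iGSE core loss density for a triangular flux waveform
    of peak-to-peak delta_b at frequency f and duty cycle `duty`.
    P = (1/T) * integral of k_i * |dB/dt|^alpha * delta_b^(beta-alpha) dt."""
    ki = ki_from_steinmetz(k, alpha, beta)
    rise = ki * (delta_b * f / duty) ** alpha * delta_b ** (beta - alpha) * duty
    fall = ki * (delta_b * f / (1 - duty)) ** alpha * delta_b ** (beta - alpha) * (1 - duty)
    return rise + fall

# Hypothetical ferrite fit; units follow the fit's own convention.
p = igse_triangular(k=3.53, alpha=1.42, beta=2.88, f=200e3, delta_b=0.2)
print(p)
```

Note that asymmetric duty cycles raise the loss relative to the symmetric case, one of the effects a single classic Steinmetz evaluation misses entirely.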

Thermal modelling builds complete thermal networks from junction to ambient, including heatsink convection (natural and forced), PCB thermal spreading for surface-mount packages with multi-layer board modelling, thermal via arrays, and radiation heat transfer, which can contribute 30-50% of total cooling at elevated temperatures. An iterative electro-thermal solver captures the coupling between temperature and loss: R_DS(on) increases with temperature, which increases loss, which in turn increases temperature.
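The electro-thermal coupling is a fixed-point problem, and a minimal sketch shows the shape of it (single device, single thermal resistance, assumed temperature coefficient; a real solver iterates the full network):

```python
def electro_thermal_solve(i_rms, rdson_25, r_th_ja, t_amb, tc=0.007,
                          tol=1e-3, max_iter=100):
    """Fixed-point iteration between loss and junction temperature:
    R_DS(on) rises with Tj, raising loss, raising Tj. tc is an assumed
    exponential temperature coefficient for illustration."""
    tj = t_amb
    for _ in range(max_iter):
        rdson = rdson_25 * (1 + tc) ** (tj - 25.0)
        p_loss = i_rms ** 2 * rdson
        tj_new = t_amb + p_loss * r_th_ja
        if abs(tj_new - tj) < tol:
            return tj_new, p_loss
        tj = tj_new
    raise RuntimeError("did not converge: possible thermal runaway")

tj, p = electro_thermal_solve(i_rms=8.0, rdson_25=0.012, r_th_ja=40.0, t_amb=50.0)
print(tj, p)  # Tj converges well above the naive 25 degC estimate
```

A non-converging iteration here is itself informative: it signals a design operating near thermal runaway, which a single-pass calculation at 25 degC would silently miss.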

Stability analysis provides the small-signal transfer functions for each converter stage (control-to-output, line-to-output, input impedance, output impedance) for voltage mode, peak current mode, and constant on-time control. It calculates compensator component values via the K-factor method, predicts phase margin and gain margin, flags right-half-plane zero bandwidth limitations for boost-derived topologies, and warns of subharmonic oscillation risk in current-mode designs. For multi-stage systems, it performs Middlebrook impedance analysis at every stage interface, automatically detecting potential instabilities that would only appear when stages are connected.
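The Middlebrook check at a stage interface reduces to comparing the source's output impedance against the load's input impedance. A simplified sketch with hypothetical values: an input filter modelled as a damped series LC, feeding a tightly regulated converter that looks like a negative resistance of magnitude Vin²/Pin at low frequency:

```python
from math import pi

def filter_output_impedance(f, L, C, r_damp):
    """Output impedance magnitude of an input filter modelled as a
    series (L + r_damp) branch in parallel with C (simplified sketch)."""
    w = 2 * pi * f
    zl = complex(r_damp, w * L)
    zc = complex(0.0, -1.0 / (w * C))
    return abs(zl * zc / (zl + zc))

def converter_input_resistance(vin, pin):
    """A tightly regulated converter presents a negative incremental
    input resistance of magnitude Vin^2 / Pin at low frequency."""
    return vin ** 2 / pin

L, C, r = 10e-6, 22e-6, 0.2          # hypothetical filter values
r_in = converter_input_resistance(vin=48.0, pin=200.0)

f_res = 1 / (2 * pi * (L * C) ** 0.5)
worst = max(filter_output_impedance(f, L, C, r) for f in range(1000, 100001, 100))
print(f_res, worst, r_in, worst < r_in)
```

The filter's impedance peak at resonance is the danger point; a common rule of thumb is to keep it at least 6 dB below the converter's input impedance magnitude, which is exactly the kind of margin the tool reports at every interface.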

Worst-case and tolerance analysis answers the question that keeps power supply engineers awake at night: does this design still work at the edges of the component tolerance window? Feedback resistor tolerances affect output voltage accuracy. Capacitor tolerances and ageing shift loop stability margins. Inductor tolerance (which can reach 20-30% for powder cores) changes current ripple and CCM/DCM boundaries. Protection circuit thresholds drift with sense resistor and voltage divider tolerances. The tool propagates component tolerances through the full analytical model, showing the range of output voltage, stability margins, and protection thresholds across the worst-case operating envelope. This analysis is mandated for aerospace and automotive power supply qualification, and currently requires either manual spreadsheet work or expensive simulation setups.
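The simplest instance of this is extreme-value analysis of the feedback divider, which sets output voltage accuracy. A sketch with hypothetical part values (1% resistors, 1% reference), evaluating every tolerance corner:

```python
from itertools import product

def worst_case_vout(vref, r_top, r_bot, tol_r=0.01, tol_ref=0.01):
    """Extreme-value analysis of a regulated output set by a feedback
    divider: Vout = Vref * (1 + Rtop/Rbot). Evaluate all 2^3 tolerance
    corners and return the min/max output voltage."""
    corners = []
    for s_ref, s_top, s_bot in product((-1, 1), repeat=3):
        vr = vref * (1 + s_ref * tol_ref)
        rt = r_top * (1 + s_top * tol_r)
        rb = r_bot * (1 + s_bot * tol_r)
        corners.append(vr * (1 + rt / rb))
    return min(corners), max(corners)

# A 3.3 V rail from a 0.8 V reference with 1% parts.
lo, hi = worst_case_vout(vref=0.8, r_top=31.25e3, r_bot=10e3)
print(lo, hi)  # roughly +/- 2.5% around the 3.3 V nominal
```

Corner enumeration scales poorly as the component count grows, which is why a real implementation mixes extreme-value, root-sum-square, and Monte Carlo methods; the principle of propagating tolerances through the full model is the same.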

Magnetics design covers custom inductor and transformer design including planar magnetics, with detailed layer-by-layer winding breakdowns. Core selection from manufacturer databases, turns optimisation, air gap calculation with fringing field estimation, and saturation margin verification across temperature range.

EMI filter design calculates required attenuation from switching waveform analysis, sizes differential-mode and common-mode filter components, verifies insertion loss against standards limits (CISPR 32, FCC Part 15, DO-160), and feeds the filter impedance directly into the stability analysis to verify the filter doesn't destabilise the converter.
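The core arithmetic of attenuation-driven filter sizing fits in a few lines. The numbers below are hypothetical and the single undamped LC stage (-40 dB/decade) is an idealisation used for illustration:

```python
def required_attenuation_db(noise_dbuv, limit_dbuv, margin_db=6.0):
    """Required filter attenuation: estimated noise at the measurement
    frequency, minus the standard's limit, plus an engineering margin."""
    return max(0.0, noise_dbuv - limit_dbuv + margin_db)

def lc_corner_for_attenuation(f_noise, att_db):
    """Corner frequency of a single LC stage (-40 dB/decade above the
    corner) that delivers att_db of attenuation at f_noise."""
    return f_noise * 10 ** (-att_db / 40.0)

# Hypothetical: 95 dBuV estimated at 150 kHz against a 56 dBuV limit.
att = required_attenuation_db(95.0, 56.0)   # 45 dB required
fc = lc_corner_for_attenuation(150e3, att)  # ~11.2 kHz corner
print(att, fc)
```

Feeding the resulting filter's impedance back into the stability analysis is the step most workflows skip, and it is precisely where input filter oscillation problems originate.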

Protection circuit design integrates overcurrent, overvoltage, soft-start, and inrush limiting calculations with the converter design. Rather than treating protection as an afterthought, the tool sizes protection components in the context of the converter's operating conditions and propagates their tolerances through the worst-case analysis. Sense resistor power dissipation feeds into the loss budget. Soft-start timing interacts with output capacitor charging and inrush current.

What this doesn't replace

We're deliberate about what switchmode.io is and isn't. It's not a circuit simulator. It uses averaged analytical models that are accurate to within 3-5% for system efficiency and 15-25% for individual loss mechanisms. That's good enough to make design decisions, not good enough to replace a switching simulation for final verification.

It won't catch every parasitic oscillation, every layout-dependent coupling, or every corner case in a complex modulation scheme. Switching simulators exist for that, and they do it well. What we provide is the rapid analytical exploration that should happen before simulation, so that when you do open your simulator, you're verifying a design that's already well-understood analytically, not debugging a design that was cobbled together from disconnected calculations.

The goal is to make the first prototype work. Not by replacing the engineer's judgement, but by giving them better information, faster, with fewer manual errors, so they can spend their time on the genuinely creative aspects of power electronics design rather than grinding through routine calculations.

Built on proven foundations, pushing accuracy further

The analytical models in switchmode.io come from the same textbooks and papers that the power electronics community has relied on for decades. We implement Vorpérian's PWM switch model, Ridley's current-mode control analysis, Middlebrook's impedance stability criterion, the Dowell and Ferreira winding loss models, McLyman's thermal correlations. The established canon of power electronics analysis, made accessible and interconnected.

Where more recent work improves on the classics, we adopt it. Unified CCM/DCM mode transition models that handle the boundary smoothly rather than switching discretely between two separate formulations. Corrected audio susceptibility predictions for current-mode control that improve accuracy in cascaded systems where input perturbation rejection matters. Describing function models for constant on-time control, a modulation scheme growing rapidly in adoption that most design tools don't model at all. Charge-based ZVS analysis for wide bandgap devices that corrects a systematic error in the traditional energy-based approach. Core loss models with DC bias correction that account for the 50-200% increase in losses that premagnetisation can cause in inductor applications.

Every model includes its accuracy expectations and limitations. We don't hide the assumptions; we make them visible. When a calculation is based on the iGSE with typical accuracy of ±10-20%, we say so. When a thermal estimate assumes natural convection that might not apply to your enclosure, we flag it. When the small-signal model is only valid below f_sw/10, we show where that boundary is.

Power electronics is a discipline where engineers rightly distrust black boxes. We aim to be a transparent tool that shows its working, not a magic answer machine that hides its assumptions.

Why web-native matters

Existing desktop tools work. They've proven the value of integrated analytical design for power converters. But the delivery model has limitations that a web-native approach eliminates.

No installation, no license keys, no platform restrictions. An engineer on a Mac, a Chromebook, or a corporate-locked Windows machine has the same access. Cross-device continuity means a design started at the office can be reviewed on a tablet. Updates ship instantly without download cycles or version fragmentation.

Client-side calculation means your designs stay in your browser. For engineers and companies concerned about proprietary IP, the computation happens locally, with optional cloud save for convenience and a local export mode for those who want full control over their data.

And the subscription model changes the economics. Instead of a multi-thousand-pound upfront license, access starts at a price point that individual engineers can expense without procurement approval. A lower barrier to entry means more engineers designing better power supplies.

Where we're headed

The immediate platform covers the core design loop: topology configuration, component sizing, loss and thermal analysis, stability verification, magnetics design, EMI filter integration, worst-case tolerance analysis, and the full comparison workflow. This addresses the daily workflow for the majority of power electronics design tasks.

Near-term development focuses on expanding topology coverage and analytical depth: additional topologies beyond the initial set, detailed gate driver design calculations, protection circuit sizing with tolerance propagation, snubber and clamp topology comparison with automated component selection, and advanced topology support including LLC resonant converters and dual active bridge (DAB) converters. PFC stage design will be added as a first-class topology, covering both traditional boost and totem-pole bridgeless architectures with THD estimation and hold-up time analysis.

The component database will grow continuously, with the aim of supporting automated part recommendation based on system-level optimisation rather than single-parameter filtering. The goal is to answer "which five MOSFETs give me the lowest total system loss at my operating point, in packages I can source?" rather than simply listing every part that meets a voltage and current threshold.

Simulation export will bridge the gap to verification tools, generating pre-configured netlists and schematics with all component values populated from the design, so engineers can move from analytical design to switching simulation without manually rebuilding their circuit.

Measurement data import will close the loop in the other direction. Engineers who have built hardware and measured loop gain, impedance, or efficiency should be able to overlay their bench data against the analytical predictions, verifying where the models are accurate and where real-world effects diverge. This builds confidence in the tool and helps engineers calibrate their intuition about when analytical models are sufficient and when simulation or measurement is needed.

Component stress derating and reliability estimation will extend the tool toward design-for-reliability, checking calculated component stresses against standard derating guidelines and estimating capacitor lifetime from ripple current and thermal conditions. For engineers working in aerospace, automotive, or medical applications, this bridges the gap between electrical design and qualification.

The bottom line

Power electronics design is analytically tractable. The equations exist. The methods are proven. The best existing tools have demonstrated the value of putting those equations into software. What's been missing is a platform that connects them into a coherent, intuitive workflow accessible from any device, works with real component data from any manufacturer, lets the engineer explore and compare design alternatives rather than evaluating one fixed point at a time, maintains consistency across the design, handles the multi-stage interactions that make real power systems challenging, and tells you whether your design still works at the edges of the tolerance window.

We're building that platform, one well-validated analytical model at a time.


Switchmode.io is currently in development. To be notified when early access is available, or to provide input on features that would be most valuable for your design workflow, visit switchmode.io.