
IBM Condor: A 1121-Qubit Scaling Milestone


IBM's Condor processor marked a significant engineering achievement by demonstrating 1121 physical superconducting qubits, pushing the boundaries of single-chip quantum hardware scale.

IBM | Superconducting transmon | Demonstrated (not public) | Primary metric: Physical qubits | Confidence: medium

From a data analyst's perspective, the IBM Quantum Condor processor represents a fascinating, albeit challenging, case study in quantum hardware development. Announced and internally demonstrated in December 2023, Condor achieved a monumental milestone: it was the first quantum processor to surpass the 1000-qubit threshold, specifically featuring 1121 physical superconducting transmon qubits. This achievement was not merely an incremental step but a significant leap in the raw scaling capability of quantum chips, signaling a new era in the engineering and fabrication of complex quantum systems. For analysts, this immediately flags Condor as a benchmark in terms of manufacturing prowess and the ability to integrate a high density of quantum elements on a single substrate.

However, the analytical utility of Condor is uniquely constrained by its status. Unlike many of IBM's other quantum processors, Condor was not deployed to the public IBM Quantum fleet. Its role was primarily as an internal demonstration and a testbed for extreme scaling. This means that critical performance metrics, such as detailed error rates, coherence times, and benchmark results, which are standard for evaluating publicly accessible systems, have not been released. This data scarcity presents a significant hurdle for any comprehensive performance analysis or direct comparison with other quantum hardware, whether from IBM or competing vendors. Analysts must therefore interpret Condor's significance through the lens of its stated purpose: a proof-of-concept for scale rather than a utility-optimized computational engine.

The context of Condor's development is crucial for understanding its place in the broader quantum computing landscape. IBM's quantum roadmap, particularly post-2023, has emphasized a strategic shift towards modular architectures and the optimization of qubit quality and error reduction, as exemplified by the Heron processor. Condor, while a triumph in sheer qubit count, represented the zenith of single-chip scaling efforts. Its development provided invaluable insights into the challenges and opportunities of fabricating such large-scale devices. For data analysts tracking the industry, Condor serves as a powerful indicator of the engineering limits and strategic pivots within leading quantum hardware providers, highlighting the ongoing tension between raw qubit count and the more practical metrics of qubit quality and system utility.

Understanding Condor requires a nuanced approach. While the headline figure of 1121 qubits is impressive, a data analyst must look beyond this singular metric. The absence of public access means there are no operational data streams, no job execution logs, and no user-generated benchmarks to analyze. This contrasts sharply with systems like IBM's Eagle or Heron, where extensive data on circuit execution, error mitigation performance, and user adoption is available. Therefore, Condor's profile is less about its computational throughput or error-corrected performance, and more about its role in advancing the fundamental engineering capabilities required for future, more powerful quantum systems. It's a testament to what's physically possible, informing the path toward fault-tolerant quantum computing, even if it doesn't directly contribute to today's accessible quantum computations.

Ultimately, Condor stands as a testament to the rapid pace of innovation in quantum hardware. Its existence confirms the feasibility of manufacturing quantum processors with over a thousand physical qubits, a threshold once considered distant. For data analysts, this implies that future systems, particularly those built on modular architectures, could leverage these scaling insights to achieve even greater qubit counts, potentially with improved quality. The challenge remains in translating these physical qubit counts into effective computational power, a task that requires not just more qubits, but better qubits with lower error rates and robust connectivity. Condor's legacy will likely be measured not by the computations it performed, but by the foundational knowledge it provided for the next generation of quantum hardware.

Key metrics

Physical qubits
1121
Number of physical qubits available for gate operations
Benchmark headline
Not publicly available (2025)
Internal demonstrations only; no published benchmark results
Error-correction readiness
20/100
Heuristic score from topology + mode + error hints
Historical importance
30/100
Heuristic score from milestones + roadmap language
Native gates
SX | RZ | ECR
Gate alphabet you compile to
Connectivity
Heavy-hex lattice
Mapping overhead + routing depth sensitivity

Technical specifications

System ID: IBM_CONDOR
Vendor: IBM
Technology: Superconducting transmon
Status: Demonstrated (not public)
Primary metric: 1121 physical qubits
Metric meaning: Number of physical qubits available for gate operations
Qubit mode: Gate-based with physical qubits; focused on scale testing
Connectivity: Heavy-hex lattice
Native gates: SX | RZ | ECR
Error rates & fidelities: Not publicly confirmed (checked IBM blogs; no dated error rates released; the design aimed for low errors but prioritized scale over quality)
Benchmarks: Not publicly available (2025); internal demos only
How to access: N/A
Platforms: N/A
SDKs: N/A
Regions: N/A
Account requirements: N/A
Pricing model: N/A
Example prices: N/A
Free tier / credits: N/A
First announced: 2023-12
First available: 2023-12 (internal demo)
Major revisions: None
Retired / roadmap: Active internally; roadmap shifted to modular architectures post-2023
Notes: Announced but never added to the public fleet; current IBM resources do not list it

System profile

The IBM Quantum Condor processor, with its 1121 physical superconducting transmon qubits, represents a significant engineering achievement in the realm of quantum hardware. From a data analyst's perspective, this primary metric of qubit count is a crucial indicator of the system's scale. A 'physical qubit' refers to an individual quantum bit on the chip, capable of holding quantum information and participating in gate operations. The sheer number of these qubits on a single chip pushed the boundaries of what was previously thought possible in terms of fabrication and integration density. This scale is particularly relevant for exploring the challenges associated with crosstalk, thermal management, and control signal routing in large quantum systems.

Technology and Qubit Mode: Condor utilizes superconducting transmon qubits, a mature and widely adopted technology in quantum computing. This gate-based architecture means that computations are performed through a sequence of quantum gates applied to these physical qubits. The focus for Condor was explicitly on 'scale testing,' implying that the primary objective was to demonstrate the feasibility of manufacturing and operating such a large number of qubits, rather than optimizing for immediate computational utility or achieving the lowest possible error rates. This distinction is vital for analysts, as it frames expectations regarding performance data.

Connectivity Topology: The processor features a heavy-hex lattice connectivity topology. This specific arrangement dictates how qubits are interconnected, influencing which qubits can directly interact via two-qubit gates. Heavy-hex lattices are known for offering a balance between connectivity and qubit density, and they are often considered for their potential benefits in error correction schemes due to their structured nature. For data analysts, understanding the topology is critical for assessing the types of quantum circuits that can be efficiently mapped onto the hardware and for predicting potential limitations in circuit depth or parallelism due to communication constraints.
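The routing-depth sensitivity described above can be made concrete: on a sparse coupling map, a two-qubit gate between non-adjacent qubits must first be routed, typically by inserting SWAPs along a shortest path, and each SWAP compiles to three two-qubit gates. A minimal sketch, using a hypothetical five-qubit linear fragment rather than Condor's actual coupling map (which has not been published in detail):

```python
from collections import deque

def shortest_path(coupling, src, dst):
    """BFS over an undirected coupling map; returns the qubit path src..dst."""
    prev = {src: None}
    queue = deque([src])
    while queue:
        q = queue.popleft()
        if q == dst:
            break
        for nb in coupling[q]:
            if nb not in prev:
                prev[nb] = q
                queue.append(nb)
    path = [dst]
    while prev[path[-1]] is not None:
        path.append(prev[path[-1]])
    return path[::-1]

# Toy fragment of a sparse, heavy-hex-like map (NOT the real Condor layout):
coupling = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}

path = shortest_path(coupling, 0, 4)
swaps = len(path) - 2        # SWAPs needed to make the endpoints adjacent
extra_2q = 3 * swaps         # each SWAP decomposes into 3 two-qubit gates
print(path, swaps, extra_2q)
```

Even this tiny example shows how a single long-range interaction can inflate two-qubit gate counts, which is why sparse topologies like heavy-hex trade routing overhead for reduced crosstalk.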

Native Gates: Condor supports a set of native gates including SX, RZ, and ECR. These gates form a universal set, meaning that any arbitrary quantum operation can be decomposed into a sequence of these fundamental gates. The SX gate is a single-qubit gate, often referred to as a square-root-of-X gate, which is essential for qubit rotation. RZ gates are also single-qubit rotations around the Z-axis. The ECR (Echoed Cross Resonance) gate is a two-qubit entangling gate, crucial for creating entanglement between qubits. While the presence of a universal gate set is standard, the actual performance of these gates (i.e., their fidelity and speed) is paramount for effective quantum computation. However, for Condor, these specific performance metrics are not publicly available.
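Universality of this gate alphabet can be illustrated with the Hadamard gate, which decomposes into native gates as RZ(π/2)·SX·RZ(π/2) up to a global phase. A small self-contained check using the standard matrix definitions (nothing here is Condor-specific):

```python
import cmath
import math

def matmul(a, b):
    """2x2 complex matrix product."""
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def rz(lam):
    """Rotation about the Z axis by angle lam."""
    return [[cmath.exp(-1j * lam / 2), 0], [0, cmath.exp(1j * lam / 2)]]

# Square-root-of-X gate.
SX = [[(1 + 1j) / 2, (1 - 1j) / 2],
      [(1 - 1j) / 2, (1 + 1j) / 2]]

H = [[1 / math.sqrt(2), 1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]

# Hadamard from native gates: RZ(pi/2) . SX . RZ(pi/2)
m = matmul(rz(math.pi / 2), matmul(SX, rz(math.pi / 2)))

# Compare up to global phase: divide out the phase of the (0, 0) entry.
phase = m[0][0] / abs(m[0][0])
diff = max(abs(m[i][j] / phase - H[i][j]) for i in range(2) for j in range(2))
print(f"max deviation from H: {diff:.2e}")
```

The same pattern, single-qubit rotations from RZ and SX plus entanglement from ECR, lets a compiler express any circuit in this alphabet.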

Error Rates and Fidelities: A significant data gap for Condor, from an analytical standpoint, is the absence of publicly confirmed error rates and fidelities. IBM's public statements and blogs, while acknowledging Condor's existence, have not released dated error rates for this specific processor. This is a critical piece of information for evaluating the practical utility of any quantum computer. Typically, error rates (e.g., single-qubit gate fidelity, two-qubit gate fidelity, readout error) are key indicators of a system's quality. The lack of this data for Condor suggests that while the system achieved unprecedented scale, its primary focus was not on delivering the lowest possible errors, a strategy that IBM later pursued with systems like Heron, which prioritized quality over raw qubit count for public deployment.
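A back-of-the-envelope model shows why gate fidelity, not qubit count, dominates practical utility: if each gate fails independently with probability p, a circuit of n gates succeeds with probability roughly F ≈ (1 − p)^n. The error rates below are illustrative assumptions, not published Condor figures:

```python
# Rough circuit-success estimate: F ~ (1 - p)**n for n gates at error rate p.
# The rates used here are illustrative; none were published for Condor.
def est_fidelity(n_gates, error_rate):
    return (1 - error_rate) ** n_gates

n_gates = 1000  # two-qubit gate count for a modest circuit
for p in (1e-2, 1e-3, 1e-4):
    print(f"p = {p:.0e}: F ~ {est_fidelity(n_gates, p):.3f}")
```

At a 1% error rate a 1000-gate circuit almost never succeeds, while at 0.01% it succeeds about nine times in ten; this gap is the quantitative core of the scale-versus-quality trade-off that IBM's post-Condor roadmap addresses.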

Benchmarks and Performance Limits: Consistent with its internal demonstration status, benchmarks for Condor are not publicly available. This means there are no published results from standard quantum benchmarks (e.g., quantum volume, randomized benchmarking, application-specific benchmarks) that would allow for a quantitative assessment of its computational power or comparison against other systems. Furthermore, operational limits such as shots, circuit depth, and duration are not applicable, as the system is not accessible to external users. There are no public queue limits or other operational constraints to analyze, as it functions purely as an internal research platform. This absence of operational data means that any performance analysis must remain theoretical, based solely on the architectural specifications.

What It Is For and Trade-offs: IBM explicitly stated that Condor was developed for 'testing extreme qubit scaling and fabrication limits.' This clarifies its role as a research and development vehicle rather than a general-purpose quantum computer. The primary trade-off identified is 'High qubit count but potentially higher errors | Not optimized for utility like Heron.' This highlights a fundamental tension in quantum hardware development: pushing the boundaries of scale often comes with challenges in maintaining or improving qubit quality. For analysts, this implies that while Condor demonstrated the 'how many,' it did not necessarily answer the 'how well' or 'how useful' questions in the same way that publicly accessible, utility-focused systems aim to do. Its value lies in the engineering lessons learned, which can inform the design of future, more robust quantum processors.

In summary, while IBM Quantum Condor stands as a monumental achievement in quantum hardware scaling, its analytical profile is characterized by a wealth of architectural detail but a scarcity of performance data. Its significance is primarily in demonstrating the feasibility of large-scale qubit integration and informing IBM's strategic shift towards modular, high-quality systems. For data analysts, it serves as a powerful reminder that raw qubit count, while impressive, is only one dimension of a multi-faceted evaluation of quantum computing capability.

Generation lineage (family-level)
IBM Quantum Falcon  →  IBM Quantum Hummingbird  →  IBM Quantum Eagle  →  IBM Quantum Condor  →  IBM Quantum Heron (r3)
Heuristic chain based on common naming. Verify by revision/date for strict claims.
Related systems (same vendor)
Cross-system comparison (same vendor)
System | Status | Primary metric
IBM Quantum System Two (QS2) | Active | 399+ physical qubits (modular)
IBM Quantum Heron (r2) | Active | 156 physical qubits
IBM Quantum Heron (r3) | Active | 156 physical qubits
IBM Quantum Heron (r1) | Active | 133 physical qubits
IBM Quantum Eagle | Active (limited) | 127 physical qubits
IBM Quantum Hummingbird | Retired | 65 physical qubits

Access & pricing

How you access it
  • No public access: IBM Quantum Condor is not available for public use or experimentation.
  • Internal IBM use only: The processor was demonstrated internally and remains an exclusive research and development platform for IBM.
  • Not on IBM Quantum Platform: It is not part of the publicly accessible IBM Quantum Experience or Qiskit Runtime fleet.
  • No SDK support: There is no direct support or integration for Condor within public SDKs like Qiskit.
  • No account requirements: As it is not publicly accessible, there are no account requirements for external users.
  • Not deployed in public regions: Condor is not deployed in any publicly accessible cloud region.
  • No how-to-access information: Information on how to access or utilize Condor is not applicable for external parties.
  • Primarily a research platform: Its role is focused on internal R&D, informing future hardware designs rather than providing computational services.
How costs sneak up
  • No public pricing: There is no public pricing model or structure for IBM Quantum Condor.
  • Not available for purchase or lease: The system is not offered for commercial purchase, lease, or pay-per-use by external entities.
  • No example prices: No example pricing data or cost estimates are available.
  • Internal cost drivers: Any associated costs are internal R&D expenses for IBM, not user-facing charges.
  • No free tier or credits: Free tier access or quantum credits are not applicable to Condor.
  • Pricing details irrelevant: As the system is not publicly accessible, pricing details are not relevant for external users or analysts.

Status timeline

The journey of the IBM Quantum Condor processor is a compelling narrative of ambitious engineering, strategic pivots, and the relentless pursuit of quantum advantage. Its timeline, though relatively brief in terms of public visibility, encapsulates a pivotal moment in IBM's quantum roadmap and the broader industry's scaling efforts.

December 2023: First Announced and Internally Demonstrated. IBM officially announced the Condor processor in December 2023, simultaneously revealing its internal demonstration. This announcement was a landmark event, as Condor was the first quantum processor to publicly cross the 1000-qubit threshold, specifically featuring 1121 physical superconducting transmon qubits. This achievement immediately positioned IBM at the forefront of raw qubit count scaling. For data analysts, this date serves as the primary anchor for Condor's existence, marking its entry into the quantum hardware discourse. The internal demonstration aspect is crucial, indicating that while the hardware was operational, it was not immediately slated for public deployment.

Post-2023: Strategic Roadmap Shift. Following Condor's demonstration, IBM's quantum roadmap underwent a significant strategic refinement. While Condor represented the peak of single-chip scaling, the company's focus shifted towards modular quantum computing architectures and, critically, towards improving qubit quality and reducing error rates. This pivot was exemplified by the introduction of the Heron processor, which, while having far fewer qubits than Condor (133 in its first revision), was designed with significantly lower error rates and higher utility for practical quantum computations. For analysts, this shift highlights a maturation in IBM's strategy, moving from a 'qubit count race' to a more holistic approach emphasizing the quality and interconnectivity of qubits for building truly useful quantum systems.

Active Internally; Roadmap Shifted: As of current information, Condor remains 'active internally' within IBM. This means it continues to serve as a valuable research and development asset, likely providing ongoing insights into the long-term stability, control, and fabrication challenges of large-scale quantum processors. However, its role in the public-facing roadmap has been superseded by modular designs. This implies that while Condor was a technical triumph, it was not intended to be a direct predecessor in a linear progression of publicly accessible, ever-larger single-chip systems. Instead, the knowledge gained from Condor is being integrated into the development of future modular systems, which aim to combine multiple smaller, high-quality chips into a larger, more powerful quantum computer.

No Major Revisions or Public Deployment: Throughout its known timeline, Condor has not undergone any major public revisions, nor has it been made available for external access. This consistent internal status underscores its role as a specialized research instrument rather than a production-ready system. For data analysts, this means there is no public evolution of the hardware to track, no performance improvements to monitor over time, and no user feedback loops to analyze. Its impact is therefore more indirect, influencing the design principles and engineering choices for subsequent generations of IBM's quantum hardware, particularly those focused on achieving fault tolerance through modularity and error correction.

In essence, Condor's timeline is a story of a singular, impressive achievement that served as a critical data point for IBM's strategic direction. It demonstrated the 'what if' of extreme single-chip scaling, providing the empirical foundation for IBM's subsequent focus on building quantum systems that prioritize not just the quantity of qubits, but their quality, connectivity, and ultimate utility in a modular framework. Its legacy is less about its direct computational output and more about its profound influence on the architectural choices and engineering priorities shaping the future of quantum computing.

What to verify next

  • Confirm if any specific research papers or academic collaborations have utilized data or insights directly from Condor experiments.
  • Seek any retrospective analysis or technical reports released by IBM detailing the engineering challenges and solutions encountered during Condor's development and operation.
  • Monitor IBM's official quantum roadmap updates for any mention of Condor's continued internal role or its direct influence on future modular system designs.
  • Investigate if any specific error rate or coherence time metrics, even if historical or aggregated, are ever released for Condor, which would significantly enhance its analytical profile.
  • Track industry discussions and expert opinions regarding the long-term implications of Condor's scaling achievement on the broader quantum hardware landscape.
  • Look for any comparisons or contrasts IBM might draw between Condor's scaling approach and the modular strategy being pursued with systems like Heron and future processors.

Sources

  • https://postquantum.com/industry-news/ibm-condor/
  • https://www.ibm.com/quantum/roadmap
  • https://research.ibm.com/blog/ibm-quantum-roadmap-2025
  • https://www.spinquanta.com/news-detail/discover-the-worlds-largest-quantum-computer-in20250106092507
  • https://quantum-computing.ibm.com/
  • https://qiskit.org/

Verification confidence: Medium. Specs can vary by revision and access tier. Always cite the exact device name + date-stamped metrics.

FAQ

What is IBM Quantum Condor?

IBM Quantum Condor is a superconducting quantum processor developed by IBM, notable for being the first to demonstrate over 1000 physical qubits, specifically 1121. It represents a significant engineering milestone in scaling quantum hardware.

Is IBM Condor publicly accessible?

No, IBM Condor is not publicly accessible. It was demonstrated internally by IBM in December 2023 and remains an internal research and development system, not part of the public IBM Quantum fleet.

What was the primary purpose of the Condor processor?

The primary purpose of Condor was to test the limits of single-chip qubit scaling and advanced fabrication techniques. It served as a crucial experiment in pushing the boundaries of physical qubit count, rather than being optimized for immediate utility or low error rates for public use.

How many qubits does IBM Condor have?

IBM Condor features 1121 physical superconducting transmon qubits. This metric refers to the raw number of individual qubits on the chip, distinct from logical qubits which incorporate error correction.

Are there any performance benchmarks or error rates available for Condor?

No, there are no public performance benchmarks or detailed error rate metrics available for IBM Condor. As an internal demonstration system, its detailed operational data has not been released, making direct performance comparisons challenging for external analysis.

How does Condor fit into IBM's overall quantum roadmap?

Condor represented the peak of IBM's single-chip scaling efforts. Following its demonstration, IBM's roadmap shifted focus towards modular architectures and improving qubit quality and error reduction, exemplified by systems like Heron, which prioritize lower error rates and higher utility over raw qubit count on a single chip.

What is the connectivity topology of Condor?

IBM Condor utilizes a heavy-hex lattice connectivity topology. This architecture dictates how qubits are interconnected, influencing gate operations and potential error correction schemes.

What are the native gates supported by Condor?

The native gates for Condor are SX, RZ, and ECR. These gates form a universal set, allowing for the construction of any quantum algorithm, though their specific implementation and fidelity are not publicly detailed for Condor.


