AI-driven design automation
AI-Driven Design Automation refers to the application of artificial intelligence (AI) techniques to automate and enhance various stages of the electronic design automation (EDA) process, particularly for integrated circuit (chip) and complex electronic system design. It has emerged as a significant field due to its potential to address the escalating complexity, cost, and time-to-market pressures in the semiconductor industry. AI-Driven Design Automation encompasses a range of methods, including machine learning, expert systems, and reinforcement learning, which are applied to tasks from architectural exploration and logic synthesis to physical design and verification. This article explores the historical evolution of AI in EDA, details its core methodologies, discusses key applications, and examines its impact on the design landscape and the semiconductor industry.
History
1980s–1990s: Expert systems and early experiments
The application of AI to design automation began to gain traction in the 1980s and 1990s, primarily through the development of expert systems. These systems aimed to capture the knowledge and heuristics of human design experts in the form of rules and inference engines to guide the design process.[1]
Notable early projects included the ULYSSES (Unified Layout Specification and Simulation Environment for Silicon) system from Carnegie Mellon University. ULYSSES was a CAD tool integration environment that allowed expert designers to codify design methodologies into executable scripts, treating CAD tools as knowledge sources managed by a scheduler.[2] The ADAM (Advanced Design AutoMation) system at the University of Southern California employed an expert system, the Design Planning Engine, to dynamically determine design control strategies and manage various design tasks, representing domain knowledge in frames.[3]
Other systems like DAA (Design Automation Assistant) demonstrated the use of rule-based approaches for specific tasks such as register-transfer level (RTL) design for systems like the IBM-370.[1] Research efforts at Carnegie Mellon University also produced TALIB, an expert system for mask layout using over 1200 rules, and EMUCS/DAA, for CPU architectural design with about 70 rules. These efforts suggested that AI approaches were most readily applied to problems where a relatively small rule set could be brought to bear on large volumes of design data.[4] Concurrently, silicon compilers such as MacPitts, Arsenic, and Palladio emerged, using algorithmic methods and search schemes over abstract design spaces, representing another avenue for automation, though not always strictly based on expert systems.[4] Early experiments with neural networks in VLSI design also occurred during this period, although they were less prominent than rule-based systems.
2000s: Introduction of Machine Learning
The 2000s marked a revival of interest in AI for design automation, largely fueled by advancements in machine learning (ML) algorithms and the increasing availability of design and manufacturing data. ML techniques began to be applied to complex problems such as modeling and mitigating the effects of manufacturing variability in semiconductor devices, which became increasingly critical at smaller technology nodes. The growing volume of data generated throughout the chip design lifecycle provided the necessary foundation for training more sophisticated ML models, enabling predictive analytics and optimization in areas previously difficult to automate.
2016–2020: Reinforcement Learning and Large-Scale Initiatives
A significant turning point occurred in the mid-to-late 2010s, catalyzed by successes in other AI domains. The achievements of DeepMind's AlphaGo in mastering the game of Go inspired researchers to explore the application of reinforcement learning (RL) to complex EDA problems, which often involve vast search spaces and sequential decision-making.
In 2018, the U.S. Defense Advanced Research Projects Agency (DARPA) launched the Intelligent Design of Electronic Assets (IDEA) program. A key objective of IDEA was to develop a “no-human-in-the-loop” layout generator capable of producing a manufacturable chip design from RTL specifications within 24 hours.[5] The OpenROAD project, a major initiative under IDEA led by UC San Diego in collaboration with industry and academic partners, aimed to create an open-source, autonomous toolchain leveraging machine learning, parallel search, and problem decomposition to achieve these goals.[5]
A prominent demonstration of RL's potential came from Google researchers between 2020 and 2021. They developed a deep reinforcement learning approach for chip floorplanning, which was reported to generate layouts superior or comparable to those produced by human experts in under six hours.[6] This method utilized a graph convolutional neural network and demonstrated the ability to learn transferable representations, improving its performance with exposure to more chip designs. The technology was subsequently used in the design of Google's Tensor Processing Unit (TPU) accelerators.[6]
2020s: Autonomous Systems and Agents
Entering the 2020s, the industry saw the commercial launch of autonomous AI-driven EDA systems. For example, Synopsys introduced DSO.ai (Design Space Optimization AI) in early 2020, presenting it as the industry's first autonomous artificial intelligence application for chip design.[7][8] This system utilizes reinforcement learning to search for optimization targets within the vast solution spaces of chip design, aiming to improve power, performance, and area (PPA).[8] By 2023, DSO.ai had been used in over 100 commercial tapeouts, indicating significant industry adoption.[9] Synopsys later expanded its AI offerings into a suite called Synopsys.ai, aiming to apply AI across the entire EDA workflow, including verification and test.[10][11]
These developments, integrating advanced AI techniques with cloud computing and extensive data resources, have prompted discussions about a new phase in EDA, sometimes referred to by industry commentators and participants as 'EDA 4.0'.[12][13] This era is characterized by the pervasive use of AI and machine learning to handle increasing design complexity, automate more aspects of the design process, and enable engineers to manage the massive volumes of data generated by EDA tools.[12][14] The goal of EDA 4.0 is to optimize product performance, reduce time-to-market, and streamline development and manufacturing processes through intelligent automation.[13]
AI Methods
Artificial intelligence techniques are increasingly employed to address complex challenges across the electronic design automation landscape. These methods analyze vast amounts of design data, learn intricate patterns, and automate decision-making processes, aiming to improve design quality, reduce turnaround times, and manage the growing complexity of semiconductor design. Key paradigms include supervised learning, unsupervised learning, reinforcement learning, and generative AI.
Supervised Learning
Supervised learning is a machine learning paradigm where algorithms are trained on labeled datasets.[15] This means that each input data point in the training set is accompanied by a known correct output or "label."[16] The algorithm learns to map inputs to outputs by identifying underlying patterns and relationships in the training data.[17] Once trained, the model can then make predictions or classifications on new, unseen data.[18]
In the context of EDA, supervised learning is broadly applicable to tasks where historical data can be used to predict future outcomes or identify specific conditions. This includes estimating various design metrics such as performance, power, and timing (as explored in works like Ithemal for CPU performance,[19] PRIMAL for RTL power,[20] and methods predicting path-based slack from graph-based analysis[21][22]) or classifying design elements to identify potential issues (such as lithography hotspots[23] or predicting routability[24]). Functionality-aware circuit representation learning also often employs supervised techniques.[25]
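The typical workflow can be illustrated with a minimal sketch: a classifier is trained on labeled examples and then used to screen new data far faster than simulation would allow. All data, features, and labels below are synthetic stand-ins (published hotspot detectors such as that of Yang et al. learn from rasterized layout clips with deep networks); the sketch shows only the supervised-learning pattern, not any particular published model.

```python
# Minimal supervised-learning sketch in the spirit of lithography hotspot
# screening. All data is synthetic; the feature choices are hypothetical.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)

# Each layout clip is summarized by a few geometric features
# (e.g., local pattern density, minimum spacing), with a label that a
# lithography simulator would have produced: 1 = hotspot, 0 = clean.
n_clips = 2000
X = rng.uniform(size=(n_clips, 4))
y = ((X[:, 0] > 0.7) & (X[:, 1] < 0.3)).astype(int)  # synthetic labels

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

# The trained model screens unseen clips much faster than full simulation,
# which is the practical appeal of supervised models in EDA.
print(classification_report(y_test, model.predict(X_test)))
```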
Unsupervised Learning
Unsupervised learning involves training algorithms on datasets without predefined labels, allowing the models to discover inherent patterns, structures, or relationships within the data on their own.[26] Common tasks include clustering (grouping similar data points), dimensionality reduction (reducing the number of variables while preserving important information), and association rule mining (discovering relationships between variables).[27]
In EDA, unsupervised methods are valuable for exploring complex design data to uncover insights that might not be apparent. For example, clustering techniques can be used to group design parameters or flow configurations, aiding in automatic design flow tuning (as explored by FIST).[28] A significant application is in representation learning, where the goal is to automatically learn meaningful and often lower-dimensional representations (features or embeddings) of circuit data. This can involve learning embeddings for analog circuit topologies using graph-based methods[29] or capturing the functional essence of netlists through contrastive learning techniques.[30]
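As a concrete illustration of the clustering idea, the sketch below groups numerically encoded design-flow configurations with k-means. The parameter encoding is hypothetical and far simpler than the feature-importance-driven sampling used by tuners such as FIST; the point is only the unsupervised grouping pattern.

```python
# Minimal unsupervised-learning sketch: cluster tool-parameter
# configurations so that similar flow setups can be tuned together.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)

# Each row is one synthesis/place-and-route configuration encoded
# numerically, e.g. [clock target scaling, placement effort,
# max fanout, utilization]. Values here are random stand-ins.
configs = rng.uniform(size=(500, 4))

scaled = StandardScaler().fit_transform(configs)
labels = KMeans(n_clusters=5, n_init=10, random_state=1).fit_predict(scaled)

# Configurations sharing a cluster label can be treated as one "family";
# evaluating a representative per family prunes the tuning search space.
for cluster_id in range(5):
    print(cluster_id, int((labels == cluster_id).sum()), "configs")
```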
Reinforcement Learning
Reinforcement learning (RL) is a type of machine learning where an agent learns to make optimal decisions by interacting with an environment over time.[31] The agent performs actions, transitions between states, and receives rewards or penalties as feedback, with the goal of maximizing a cumulative reward.[32] RL is distinct from supervised learning as it does not require labeled input/output pairs, and differs from unsupervised learning by focusing on goal-directed learning through trial and error.[33]
Within EDA, RL is particularly suited for sequential decision-making and optimization tasks in complex, high-dimensional design spaces. The adoption by commercial EDA solutions signifies its growing importance.[34] RL has been applied to physical design challenges like chip floorplanning, where an agent learns to arrange blocks to optimize objectives such as wirelength and performance.[35][36] In logic synthesis, RL can guide the selection and sequencing of optimization transforms to achieve better quality of results, as seen in approaches like AlphaSyn.[37] Gate sizing for timing optimization is another area where RL agents can learn effective policies.[38]
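The agent-environment loop can be made concrete with a toy sketch. Below, a tabular agent learns which optimization transform to apply at each position of a fixed-length synthesis recipe; the reward function is a synthetic stand-in for re-running synthesis and measuring quality of results, and the transform set is illustrative rather than taken from any published system.

```python
# Toy sketch of RL-style synthesis transform selection. The "environment"
# is a stand-in: a real system would invoke a synthesis tool and reward
# measured improvements in area or delay.
import numpy as np

rng = np.random.default_rng(2)
n_transforms = 4   # e.g. {rewrite, refactor, balance, resub} (illustrative)
seq_len = 5        # length of the optimization recipe
Q = np.zeros((seq_len, n_transforms))  # value table: state = recipe position

def reward(action):
    # Hypothetical "area reduction achieved by this transform";
    # a real reward would come from re-running synthesis on the design.
    return float(rng.normal(loc=[0.1, 0.3, 0.2, 0.05][action], scale=0.05))

alpha, eps = 0.1, 0.2
for episode in range(2000):
    for step in range(seq_len):
        # epsilon-greedy action selection
        if rng.uniform() < eps:
            a = int(rng.integers(n_transforms))
        else:
            a = int(np.argmax(Q[step]))
        # one-step update toward the observed reward (no bootstrapping is
        # needed here because the toy reward is independent across steps)
        Q[step, a] += alpha * (reward(a) - Q[step, a])

print("learned recipe:", [int(np.argmax(Q[s])) for s in range(seq_len)])
```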
Generative AI
Generative AI refers to artificial intelligence models that are capable of creating new content, such as text, images, code, or other forms of data, rather than only analyzing or acting on existing data.[39] These models learn the underlying patterns and structures from the data they are trained on and then use this knowledge to produce novel outputs.[40]
In EDA, generative AI, especially through Large Language Models (LLMs) and other generative architectures like Generative Adversarial Networks (GANs), is finding diverse applications.
Large Language Models (LLMs)
Large Language Models are deep learning models, typically based on the transformer architecture, that are pre-trained on vast amounts of text and code.[41] They excel at understanding, summarizing, generating, and predicting human language and programming languages.[42] Their capabilities are being harnessed in EDA for tasks such as:
- RTL Code Generation: LLMs are used to automatically generate Hardware Description Language (HDL) code from specifications or prompts, with benchmarks like VerilogEval[43] and RTLLM[44] developed to assess these capabilities, and tools like AutoChip aimed at automating this process;[45] an illustrative prompting sketch appears after this list.
- EDA Script Generation and Tool Interaction: LLM-based agents like ChatEDA can translate natural language commands into executable scripts for controlling EDA tools, potentially simplifying complex workflows.[46]
- Architectural Design and Exploration: LLMs assist in early-stage design by generating high-level synthesis code (e.g., GPT4AIGChip[47]), exploring design spaces for specialized hardware like Compute-in-Memory accelerators,[48] or helping to generate and review design specifications (e.g., SpecLLM[49]).
- Verification Assistance: LLMs are being explored for generating verification components like SystemVerilog Assertions (SVAs) from natural language specifications.
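The basic prompting pattern behind several of these tools can be sketched as follows, assuming an OpenAI-style chat-completions client. The model name, prompt, and extraction step are illustrative only; production systems such as AutoChip layer compilation and simulation feedback loops on top of this pattern.

```python
# Minimal sketch of LLM-assisted RTL generation: prompt a chat model for a
# Verilog module and extract the code block from the reply. The model name
# and module specification are illustrative, not from any published tool.
import re
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

spec = ("Write a synthesizable Verilog module named gray_counter with "
        "inputs clk and rst_n and a 4-bit Gray-code output q.")

response = client.chat.completions.create(
    model="gpt-4o",  # any capable chat model; the name is illustrative
    messages=[
        {"role": "system", "content": "You are a hardware design assistant."},
        {"role": "user", "content": spec},
    ],
)
reply = response.choices[0].message.content

# Extract the first fenced code block from the reply, if any.
fence = "`" * 3
match = re.search(fence + r"(?:verilog)?\n(.*?)" + fence, reply, re.DOTALL)
rtl = match.group(1) if match else reply
print(rtl)  # a fuller flow would lint/simulate this and feed errors back
```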
Other Generative Models
Beyond LLMs, other generative architectures such as Generative Adversarial Networks (GANs) play a role in EDA. GANs consist of two neural networks, a generator and a discriminator, that are trained simultaneously in a competitive process.[50] The generator learns to create data samples that mimic a training dataset, while the discriminator learns to distinguish between real and generated samples.[51] In physical design, GANs have been applied to tasks like generating sub-resolution assist features (SRAFs) to improve manufacturability in lithography (GAN-SRAF[52]) and for mask optimization (GAN-OPC[53]).
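The adversarial training loop common to these models can be illustrated with a minimal PyTorch sketch. Real layout models such as GAN-OPC operate on two-dimensional mask images with convolutional networks; the toy below uses small vectors purely to expose the generator/discriminator structure.

```python
# Minimal GAN sketch: a generator and discriminator trained adversarially.
# Data and dimensions are synthetic stand-ins, not a layout model.
import torch
import torch.nn as nn

dim, noise_dim = 16, 8
G = nn.Sequential(nn.Linear(noise_dim, 32), nn.ReLU(), nn.Linear(32, dim))
D = nn.Sequential(nn.Linear(dim, 32), nn.ReLU(), nn.Linear(32, 1))

opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

real_data = torch.randn(256, dim) * 0.5 + 1.0  # stand-in "real" samples

for step in range(200):
    z = torch.randn(64, noise_dim)
    fake = G(z)

    # Discriminator update: label real samples 1, generated samples 0.
    idx = torch.randint(0, real_data.size(0), (64,))
    opt_d.zero_grad()
    d_loss = (loss_fn(D(real_data[idx]), torch.ones(64, 1)) +
              loss_fn(D(fake.detach()), torch.zeros(64, 1)))
    d_loss.backward()
    opt_d.step()

    # Generator update: fool the discriminator into outputting 1 for fakes.
    opt_g.zero_grad()
    g_loss = loss_fn(D(fake), torch.ones(64, 1))
    g_loss.backward()
    opt_g.step()
```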
Applications
Artificial intelligence (AI) is being applied to numerous stages of the electronic design workflow, aiming to enhance efficiency, optimize results, and manage the increasing complexity of modern integrated circuits.[54][55][56] AI techniques assist designers from the initial architectural concepts through to manufacturing and testing.
High-Level Synthesis and Architectural Exploration
In the early phases of chip design, AI contributes to High-Level Synthesis (HLS) and system-level Design Space Exploration (DSE). These processes are fundamental for translating abstract specifications into concrete hardware architectures.[54] AI algorithms, often supervised learning techniques, are used to build surrogate models that quickly estimate crucial design metrics like area, performance, and power for a multitude of architectural choices or HLS configurations.[54][56] This rapid estimation reduces the need for time-consuming simulations and enables much broader exploration of candidate designs.[54] For example, the Ithemal tool uses deep neural networks to estimate the throughput of basic code blocks, informing processor architecture choices.[19] Similarly, PRIMAL employs machine learning for power inference at the register-transfer level (RTL), providing early insights into power consumption.[20] Reinforcement learning (RL) and Bayesian optimization are also applied to guide the DSE process, helping to navigate the vast parameter space and identify optimal HLS directives or architectural parameters such as cache sizes.[54][55] LLMs are also being explored for generating architectural specifications or initial C-code for HLS, as seen with GPT4AIGChip.[47][56]
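A minimal sketch of the surrogate-model loop follows: a handful of configurations are actually synthesized, a regressor is fitted to the results, and the model then ranks the rest of the space. The directive encoding and the `synthesize` stand-in are hypothetical; a real flow would invoke an HLS tool for each training point.

```python
# Sketch of surrogate-based design space exploration for HLS. The cost
# model below is a synthetic stand-in for hours-long tool runs.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(3)

# Candidate configurations: [unroll factor, pipeline II, partition factor]
space = np.array([[u, ii, p] for u in (1, 2, 4, 8)
                             for ii in (1, 2, 4)
                             for p in (1, 2, 4)])

def synthesize(cfg):
    # Hypothetical stand-in for an HLS + implementation run returning
    # latency; a real flow would call the tool and parse its reports.
    u, ii, p = cfg
    return 1000.0 / (u * p) * ii + rng.normal(scale=5.0)

# Synthesize only a small training sample...
train_idx = rng.choice(len(space), size=12, replace=False)
latencies = np.array([synthesize(space[i]) for i in train_idx])

# ...and let the surrogate rank the entire space almost instantly.
surrogate = GradientBoostingRegressor().fit(space[train_idx], latencies)
pred = surrogate.predict(space)
best = space[np.argmin(pred)]
print("most promising configuration to synthesize next:", best)
```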
Logic Synthesis and Optimization
Logic synthesis is the process of transforming an RTL hardware description into an optimized gate-level netlist tailored for a specific manufacturing technology. AI methods assist in various aspects of this transformation, including logic optimization, technology mapping, and post-mapping refinements.[54][55] Supervised learning, particularly using Graph Neural Networks (GNNs) suited for circuit graph data, helps create models to predict design properties like power or error rates in approximate circuits.[54][56] These predictions then guide optimization algorithms. Reinforcement learning is employed to directly conduct logic optimization, for example, by training agents to select sequences of logic transformations to minimize area under timing constraints.[54][56] AlphaSyn uses Monte Carlo Tree Search with RL to optimize logic for area reduction.[37] FlowTune employs a multi-armed bandit strategy for selecting synthesis flows.[56] AI can also tune parameters for entire synthesis flows, learning from past designs to suggest optimal tool settings for new ones.[54]
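The multi-armed bandit formulation used by flow-selection approaches such as FlowTune can be sketched in a few lines: each candidate flow is an arm, and an upper-confidence-bound rule balances trying under-explored flows against exploiting those that have scored well. The flow names and noisy quality scores below are stand-ins for real synthesis runs, not FlowTune's actual algorithm.

```python
# Sketch of UCB1 bandit selection among candidate synthesis flows.
import math
import random

random.seed(4)
flows = ["resyn2", "compress2rs", "flow_a", "flow_b"]  # illustrative arms
true_quality = [0.55, 0.60, 0.70, 0.50]  # hidden mean QoR per flow (stand-in)

counts = [0] * len(flows)
sums = [0.0] * len(flows)

def ucb1(i, t):
    if counts[i] == 0:
        return float("inf")  # force each arm to be tried once
    mean = sums[i] / counts[i]
    return mean + math.sqrt(2.0 * math.log(t) / counts[i])

for t in range(1, 201):
    arm = max(range(len(flows)), key=lambda i: ucb1(i, t))
    reward = random.gauss(true_quality[arm], 0.05)  # noisy QoR observation
    counts[arm] += 1
    sums[arm] += reward

best = max(range(len(flows)), key=lambda i: sums[i] / counts[i])
print("selected flow:", flows[best], "tried", counts[best], "times")
```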
Physical Design
Physical design converts the gate-level netlist into a geometric layout, dictating the physical arrangement and interconnection of circuit elements. AI is applied extensively in this domain to improve PPA metrics.[54][55]
Placement
Placement involves finding optimal locations for circuit blocks (macros) and standard cells. Reinforcement learning has been notably applied to macro placement, where an agent learns to position blocks to minimize wirelength and improve timing, as demonstrated by Google's work[35] and the GoodFloorplan approach.[36] Supervised learning models, including CNNs treating layouts as images, are used to predict routability issues such as design rule violations (DRVs) (e.g., RouteNet[24]) or post-routing timing directly from placement data.[54][56] RL-Sizer uses deep RL for optimizing gate sizes during placement to meet timing targets.[38]
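A minimal sketch of the image-style formulation follows: placement-derived feature maps on a grid are fed to a small convolutional network that outputs a per-tile violation likelihood. The architecture, channel choices, and shapes are illustrative and not those of RouteNet itself.

```python
# Sketch of CNN-based routability prediction on placement feature maps.
import torch
import torch.nn as nn

class RoutabilityNet(nn.Module):
    def __init__(self, in_channels=2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_channels, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, kernel_size=1),  # per-tile DRV logit
        )

    def forward(self, x):
        return self.net(x)

model = RoutabilityNet()
# One "layout": two hypothetical feature channels (cell density,
# pin density) on a 64x64 placement grid, values random here.
features = torch.rand(1, 2, 64, 64)
drv_probability = torch.sigmoid(model(features))
print(drv_probability.shape)  # torch.Size([1, 1, 64, 64])
```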
Clock Network Synthesis
AI contributes to Clock Tree Synthesis (CTS) by optimizing the clock distribution network. GANs, sometimes combined with RL (e.g., GAN-CTS), are used to predict and refine clock tree structures, aiming to minimize clock skew and power consumption.[54][55][56]
Routing
Routing creates the physical wire connections. AI models predict routing congestion using techniques like GANs to guide routing algorithms.[54][56] RL is also employed to optimize the order in which nets are routed to reduce violations.[54]
Power/Ground Network Synthesis and Analysis
AI models, including CNNs and tree-based methods, assist in designing and analyzing the Power Delivery Network (PDN) by quickly estimating static and dynamic IR drop, thus guiding PDN synthesis and reducing design iterations.[54][55][56]
Verification and Validation
Verification and validation are crucial for ensuring a chip's correctness. AI is applied to make these often lengthy processes more efficient.[54] LLMs are used to translate natural language specifications into formal SystemVerilog assertions (SVAs) (e.g., AssertLLM)[56] and to aid in security verification.[56] Techniques for predicting path-based timing analysis results from graph-based analysis, such as those pioneered by Kahng et al.[21] and advanced with transformer models like TF-Predictor,[22] significantly speed up sign-off timing checks. DeepGate2 contributes functionality-aware circuit representation learning that can support verification tasks.[25]
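The GBA-to-PBA prediction idea can be sketched as a simple regression: features extracted from fast graph-based analysis are mapped to the slack that slower path-based analysis would report. The features and data below are synthetic and illustrative; published models use much richer per-path features from the timing reports.

```python
# Sketch of learning to predict path-based (PBA) slack from cheaper
# graph-based (GBA) timing results, following the general idea of
# Kahng et al. All features and labels here are synthetic stand-ins.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(5)

# Per-path features: [GBA slack, path depth, cell delay, wire delay]
n_paths = 5000
X = rng.normal(size=(n_paths, 4))
# Synthetic ground truth: PBA slack as GBA slack plus a structured
# correction, standing in for real paired timing reports.
y = X[:, 0] + 0.15 * X[:, 1] - 0.05 * X[:, 2] \
    + rng.normal(scale=0.02, size=n_paths)

model = LinearRegression().fit(X, y)
# At sign-off, predicted PBA slack screens paths so that the expensive
# exact analysis is run only where the prediction is near-critical.
print(np.round(model.predict(X[:10]), 3))
```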
Analog and Mixed-Signal Design
AI techniques are increasingly applied to the complex domain of analog and mixed-signal circuit design, aiding in topology selection, device sizing, and layout automation.[54][56] AI models, including Variational Autoencoders (VAEs) and RL, help explore and generate new circuit topologies.[56] For example, graph embeddings can be used for topology optimization of operational amplifiers.[29] Machine learning surrogate models provide fast performance estimations for device sizing, while RL directly optimizes device parameters.[56][54]
Test, Manufacturing, and Yield Optimization
AI contributes to post-silicon stages, including testing, design for manufacturability (DFM), and yield optimization.[54] In lithography, AI models like CNNs and GANs are employed for SRAF generation (e.g., GAN-SRAF[52]) and OPC (e.g., GAN-OPC[53]) to enhance chip printability. AI also predicts lithography hotspots from layouts, as demonstrated by Yang et al.[23] For broader design flow tuning related to manufacturing, FIST utilizes tree-based methods for parameter selection.[28]
Hardware-Software Co-design
Hardware-Software Co-design addresses the simultaneous optimization of hardware and software components. LLMs are emerging as tools to facilitate this, for instance, by assisting in the design of Compute-in-Memory (CiM) DNN accelerators where software mapping and hardware configuration are tightly coupled, as explored by Yan et al.[48][56] LLMs can also generate architectural specifications (e.g., SpecLLM[49]) or HDL code through benchmarks like VerilogEval[43] and RTLLM,[44] or with tools such as AutoChip.[45] Furthermore, LLM-based agents like ChatEDA facilitate interaction with EDA tools for various design stages.[46] For netlist representation, works like Wang et al. focus on learning functionality-aware embeddings.[30]
Industry Adoption and Ecosystem
The integration of artificial intelligence into electronic design automation is a broad trend, with various players in the semiconductor ecosystem contributing to and adopting these technologies. This includes EDA tool vendors who develop AI-powered software, semiconductor design companies and foundries that utilize these tools for chip creation and manufacturing, and hyperscalers or large system companies that may develop their own chips and AI-driven design methodologies.
EDA Tool Vendors
Major EDA companies are at the forefront of incorporating AI into their tool suites to address increasing design complexity. Their strategies often involve creating comprehensive AI platforms that apply machine learning across multiple stages of the design and manufacturing flow.
Synopsys offers a suite of tools under its Synopsys.ai initiative, which aims to improve design metrics and productivity from system architecture through to manufacturing.[57] A core component, DSO.ai, uses reinforcement learning to optimize power, performance, and area (PPA) in the RTL-to-GDSII flow. Other modules apply AI to accelerate verification coverage closure, optimize test pattern generation for manufacturing, and improve the design of analog circuits across various operating conditions.[57]
Cadence has developed its Cadence.AI platform, which it describes as employing "agentic AI workflows" to reduce design engineering time for complex SoCs.[58] Key platforms apply AI for digital design flow optimization (Cadence Cerebrus), verification productivity (Verisium), custom and analog IC design (Virtuoso Studio), and system-level analysis (Optimality Intelligent System Explorer).[58][59]
Siemens EDA focuses its AI strategy on optimizing its existing software engines and workflows to provide engineers with deeper design insights.[60] AI is used within its Calibre platform to accelerate manufacturing-related tasks like Design for Manufacturability (DFM), Resolution Enhancement Techniques (RET), and Optical Proximity Correction (OPC). AI is also applied in its Questa suite to speed up coverage closure in digital verification and in its Solido suite to reduce the characterization workload for analog designs.[60]
Semiconductor Design and FPGA Companies
Companies that design semiconductor chips, including FPGAs and adaptive SoCs, are significant adopters and developers of AI-enhanced EDA methodologies to streamline their design processes.
AMD provides a suite of tools for its adaptive hardware that incorporates AI principles. The AMD Vitis platform is an environment for developing designs on its SoCs and FPGAs, and includes a component, Vitis AI, with libraries and pre-trained models for AI inference acceleration.[61][62] The accompanying Vivado Design Suite uses machine learning techniques to improve quality-of-results (QoR) and aid in achieving timing closure and power estimation for the hardware design.[61]
NVIDIA operates a dedicated Design Automation Research group to explore novel EDA methods.[63] The group's focus includes GPU-accelerated EDA tools and the application of AI techniques like Bayesian optimization and reinforcement learning to EDA problems. An example of their research is AutoDMP, a tool that automates macro placement using multi-objective Bayesian optimization and a GPU-accelerated placer.[64]
Cloud Providers and Hyperscalers
Large cloud service providers and hyperscale companies play a dual role: they offer the scalable computational infrastructure crucial for running demanding AI and EDA workloads, and many also engage in designing their own custom silicon, often leveraging AI in these internal design processes.
Google Cloud, for instance, provides a platform supporting EDA workloads with scalable compute resources, specialized storage solutions, and high-performance networking.[65] Concurrently, Google's internal chip design teams have contributed to EDA research, notably by applying reinforcement learning to physical design tasks like chip floorplanning.[35]
IBM offers EDA-focused infrastructure on its cloud platform, with an emphasis on foundry-secure environments and high-performance computing.[66] Their solutions incorporate high-performance parallel storage and large-scale job management tools designed to help design houses manage the complex simulation and modeling workloads inherent in modern EDA.[66]
Limitations and Challenges
Data Quality and Availability
A primary challenge for the widespread and effective application of AI in EDA is the accessibility and quality of data.[54][56] Machine learning models, particularly deep learning approaches, typically require large, diverse, and high-quality datasets for training to ensure they can generalize well to new, unseen designs.[56] However, much of the detailed design data in the semiconductor industry is proprietary and highly sensitive, making companies reluctant to share it publicly. This scarcity of open, comprehensive benchmarks hinders academic research and the development of broadly applicable models.[54][56] Even when data is available, it may suffer from issues like noise, incompleteness, or imbalance (e.g., an overrepresentation of successful design iterations compared to problematic ones), which can lead to biased or poorly performing AI models.[55] The effort and cost associated with collecting, curating, and accurately labeling large-scale EDA datasets also pose significant hurdles.[56] Addressing these data-related challenges is crucial for advancing AI in EDA, with potential solutions including the development of robust data augmentation techniques, the generation of realistic synthetic data, and the creation of community-driven platforms for secure data sharing and benchmarking.[56]
Integration and Compute Cost
The practical deployment of AI solutions within the EDA domain faces significant hurdles related to integration with existing complex toolchains and the substantial computational costs involved.[54][55] Incorporating novel AI models and algorithms into established EDA workflows, which often consist of many interlinked tools and proprietary formats, requires considerable engineering effort and can encounter interoperability issues.[54][56] Furthermore, training and executing sophisticated AI models, especially deep learning architectures, demand significant computational resources, including powerful GPUs or specialized AI accelerators, large memory capacities, and extended processing times.[54][56] These requirements translate into increased operational costs for both model development and deployment.[55] The scalability of AI techniques to handle the ever-increasing size and complexity of modern chip designs, while maintaining computational efficiency and reasonable runtime memory, remains an ongoing challenge.[55][56]
Intellectual Property and Confidentiality
The use of AI in EDA, particularly when dealing with sensitive design data, raises critical concerns regarding the protection of intellectual property (IP) and maintaining data confidentiality.[54][56] Chip designs represent highly valuable IP, and there is inherent risk in exposing this proprietary information to AI models, especially if they are developed by third parties or operate on cloud-based platforms.[54] Ensuring that design data used for training or inference is not compromised, leaked, or used to inadvertently transfer proprietary knowledge is paramount.[56] While strategies such as fine-tuning open-source models on private datasets are being explored to mitigate some privacy concerns,[56] the establishment of secure data handling practices, robust access controls, and clear data governance frameworks is essential. The reluctance to share detailed design data due to these IP and confidentiality concerns also impedes collaborative research and the creation of more comprehensive AI models for the EDA industry.[54][56]
Human Oversight and Interpretability
Despite the drive towards automation, the role of human designers remains crucial, and challenges related to human oversight and the interpretability of AI models persist.[54][56] Many advanced AI models, particularly deep learning systems, can function as "black boxes," making it difficult for engineers to understand the reasoning behind their predictions or design choices.[54] This lack of transparency can be a barrier to adoption, as designers may be hesitant to trust or deploy solutions whose decision-making processes are opaque, especially in safety-critical applications or when debugging unexpected outcomes.[55] While AI can automate many tasks, human expertise is still indispensable for setting design goals, validating AI-generated results, handling novel or corner-case scenarios where AI models may falter, and providing the domain knowledge that often guides AI development.[54][56] The effective integration of AI into EDA necessitates a collaborative dynamic between human engineers and intelligent tools, requiring designers to develop new skills in working with and supervising AI systems.[56]
References
- ^ a b Parker, A. C.; Hayati, S. (1987). "Automating the VLSI design process using expert systems and silicon compilation". Proceedings of the IEEE. 75 (6): 777–785. doi:10.1109/PROC.1987.13806.
- ^ Bushnell, M.; Director, S. (1986). "VLSI CAD tool integration using the ULYSSES environment". Proceedings of the 23rd Design Automation Conference. pp. 55–61. doi:10.1109/DAC.1986.158594.
- ^ Granacki, J.; Knapp, D.; Parker, A. (1985). "The ADAM Advanced Design AutoMation System: Overview, planner and natural language interface". Proceedings of the 22nd Design Automation Conference. pp. 727–733. doi:10.1145/318012.318155.
- ^ a b Kirk, R. S. (December 1985). "The impact of AI technology on VLSI design". Proceedings of the International Workshop on Managing Requirements Knowledge. IEEE Computer Society. p. 125. doi:10.1109/MARQWK.1985.10.
- ^ a b Ajayi, T.; Blaauw, D. (January 2019). "OpenROAD: Toward a self-driving, open-source digital layout implementation tool chain". Proceedings of the Government Microcircuit Applications & Critical Technology Conference (GOMACTech).
- ^ a b Mirhoseini, A.; Goldie, A.; Yazgan, M. (2021). "A graph placement methodology for fast chip design". Nature. 594 (7862): 207–212. doi:10.1038/s41586-021-03544-w.
- ^ "Synopsys Advances State-of-the-Art in Electronic Design with Revolutionary Artificial Intelligence Technology". PR Newswire. 11 March 2020. Retrieved 7 June 2025.
- ^ a b "DSO.ai: AI-Driven Design Applications". Synopsys. Retrieved 7 June 2025.
- ^ Dahad, Nitin (10 February 2023). "AI-Powered Chip Design Goes Mainstream". EE Times. Retrieved 7 June 2025.
- ^ Freund, Karl (29 March 2023). "Synopsys.ai: New AI Solutions Across The Entire Chip Development Workflow". Forbes. Retrieved 7 June 2025.
- ^ "Synopsys.ai – Full Stack, AI-Driven EDA Suite". Synopsys. Retrieved 7 June 2025.
- ^ a b "Welcome To EDA 4.0 And The AI-Driven Revolution". Semiconductor Engineering. 1 June 2023. Retrieved 7 June 2025.
- ^ a b "EDA 4.0 And The AI-Driven Revolution". unipv.news (reporting on a Siemens presentation). 29 November 2023. Retrieved 7 June 2025.
- ^ Dahad, Nitin (10 November 2022). "How AI-based EDA will enable, not replace the engineer". Embedded.com. Retrieved 7 June 2025.
- ^ "What Is Supervised Learning?". IBM. Retrieved 7 June 2025.
- ^ "What is Supervised Learning?". Google Cloud. Retrieved 7 June 2025.
- ^ "Supervised Learning". SAS Institute. Retrieved 7 June 2025.
- ^ "What Is Supervised Learning? - MATLAB & Simulink". MathWorks. Retrieved 7 June 2025.
- ^ a b Mendis, C.; Renda, A.; Amarasinghe, S.; Carbin, M. (May 2019). "Ithemal: Accurate, portable and fast basic block throughput estimation using deep neural networks". International Conference on Machine Learning. PMLR. pp. 4505–4515.
- ^ a b Zhou, Y.; Ren, H.; Zhang, Y.; Keller, B.; Khailany, B.; Zhang, Z. (June 2019). "PRIMAL: Power inference using machine learning". Proceedings of the 56th Annual Design Automation Conference 2019. pp. 1–6. doi:10.1145/3316781.3317868.
- ^ a b Kahng, A. B.; Mallappa, U.; Saul, L. (October 2018). "Using machine learning to predict path-based slack from graph-based timing analysis". 2018 IEEE 36th International Conference on Computer Design (ICCD). IEEE. pp. 603–612. doi:10.1109/ICCD.2018.00099.
- ^ a b Cao, P.; He, G.; Yang, T. (2022). "TF-Predictor: Transformer-based prerouting path delay prediction framework". IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems. 42 (7): 2227–2237. doi:10.1109/TCAD.2022.3226970.
- ^ a b Yang, H.; Luo, L.; Su, J.; Lin, C.; Yu, B. (2017). "Imbalance aware lithography hotspot detection: a deep learning approach". Journal of Micro/Nanolithography, MEMS, and MOEMS. 16 (3): 033504. doi:10.1117/1.JMM.16.3.033504.
- ^ a b Xie, Z.; Huang, Y. H.; Fang, G. Q.; Ren, H.; Fang, S. Y.; Chen, Y.; Hu, J. (November 2018). "RouteNet: Routability prediction for mixed-size designs using convolutional neural network". 2018 IEEE/ACM International Conference on Computer-Aided Design (ICCAD). IEEE. pp. 1–8. doi:10.1109/ICCAD.2018.8573507.
- ^ a b Shi, Z.; Pan, H.; Khan, S.; Li, M.; Liu, Y.; Huang, J. (October 2023). "DeepGate2: Functionality-aware circuit representation learning". 2023 IEEE/ACM International Conference on Computer Aided Design (ICCAD). IEEE. pp. 1–9. doi:10.1109/ICCAD57390.2023.10320806.
- ^ "What is Unsupervised Learning?". Amazon Web Services. Retrieved 7 June 2025.
- ^ "What is unsupervised learning?". TechTarget. Retrieved 7 June 2025.
- ^ a b Xie, Z.; Fang, G. Q.; Huang, Y. H.; Ren, H.; Zhang, Y.; Khailany, B. (January 2020). "FIST: A feature-importance sampling and tree-based method for automatic design flow parameter tuning". 2020 25th Asia and South Pacific Design Automation Conference (ASP-DAC). IEEE. pp. 19–25. doi:10.1109/ASP-DAC47756.2020.9045421.
- ^ a b Lu, J.; Lei, L.; Yang, F.; Shang, L.; Zeng, X. (March 2022). "Topology optimization of operational amplifier in continuous space via graph embedding". 2022 Design, Automation & Test in Europe Conference & Exhibition (DATE). IEEE. pp. 142–147. doi:10.23919/DATE54114.2022.9774676.
- ^ a b Wang, Z.; Bai, C.; He, Z.; Zhang, G.; Xu, Q.; Ho, T. Y. (July 2022). "Functionality matters in netlist representation learning". Proceedings of the 59th ACM/IEEE Design Automation Conference. pp. 61–66. doi:10.1145/3489517.3530505.
- ^ "Reinforcement Learning Terminology". Google Developers. Retrieved 7 June 2025.
- ^ "What is reinforcement learning (RL)? - Azure Machine Learning". Microsoft Learn. Retrieved 7 June 2025.
- ^ "Reinforcement Learning Bootcamp". Berkeley Artificial Intelligence Research (BAIR). 5 December 2018. Retrieved 7 June 2025.
- ^ "Synopsys. ai unveiled as industry's first full-stack, ai-driven eda suite for chipmakers". Synopsys. 2023. Retrieved 7 June 2025.
- ^ a b c Mirhoseini, A.; Goldie, A.; Yazgan, M.; Jiang, J. W.; Songhori, E.; Wang, S. (June 2021). "A graph placement methodology for fast chip design". Nature. 594 (7862): 207–212. doi:10.1038/s41586-021-03544-w.
- ^ a b Xu, Q.; Geng, H.; Chen, S.; Yuan, B.; Zhuo, C.; Kang, Y.; Wen, X. (2021). "GoodFloorplan: Graph convolutional network and reinforcement learning-based floorplanning". IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems. 41 (10): 3492–3502. doi:10.1109/TCAD.2021.3137018.
- ^ a b Pei, Z.; Liu, F.; He, Z.; Chen, G.; Zheng, H.; Zhu, K.; Yu, B. (October 2023). "AlphaSyn: Logic synthesis optimization with efficient Monte Carlo tree search". 2023 IEEE/ACM International Conference on Computer Aided Design (ICCAD). IEEE. pp. 1–9. doi:10.1109/ICCAD57390.2023.10320889.
- ^ a b Lu, Y. C.; Nath, S.; Khandelwal, V.; Lim, S. K. (December 2021). "RL-Sizer: VLSI gate sizing for timing optimization using deep reinforcement learning". 2021 58th ACM/IEEE Design Automation Conference (DAC). IEEE. pp. 733–738. doi:10.1109/DAC18074.2021.9586135.
- ^ "What is ChatGPT, DALL-E, and generative AI?". McKinsey & Company. 2 April 2024. Retrieved 7 June 2025.
- ^ "What is generative AI? An explainer". World Economic Forum. 12 January 2023. Retrieved 7 June 2025.
- ^ "What are Large Language Models? - LLMs Explained". Amazon Web Services. Retrieved 7 June 2025.
- ^ "What Is a Large Language Model (LLM)?". NVIDIA. Retrieved 7 June 2025.
- ^ a b Liu, M.; Pinckney, N.; Khailany, B.; Ren, H. (October 2023). "VerilogEval: Evaluating large language models for Verilog code generation". 2023 IEEE/ACM International Conference on Computer Aided Design (ICCAD). IEEE. pp. 1–8. doi:10.1109/ICCAD57390.2023.10320803.
- ^ a b Lu, Y.; Liu, S.; Zhang, Q.; Xie, Z. (January 2024). "RTLLM: An open-source benchmark for design RTL generation with large language model". 2024 29th Asia and South Pacific Design Automation Conference (ASP-DAC). IEEE. pp. 722–727. doi:10.1109/ASP-DAC58473.2024.10468226.
- ^ a b Thakur, S.; Blocklove, J.; Pearce, H.; Tan, B.; Garg, S.; Karri, R. (2023). "AutoChip: Automating HDL generation using LLM feedback". arXiv:2311.04887.
- ^ a b Wu, H.; He, Z.; Zhang, X.; Yao, X.; Zheng, S.; Zheng, H.; Yu, B. (2024). "ChatEDA: A large language model powered autonomous agent for EDA". IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems. doi:10.1109/TCAD.2024.3380985.
- ^ a b Fu, Y.; Zhang, Y.; Yu, Z.; Li, S.; Ye, Z.; Li, C. (October 2023). "Gpt4aigchip: Towards next-generation ai accelerator design automation via large language models". 2023 IEEE/ACM International Conference on Computer Aided Design (ICCAD). IEEE. pp. 1–9. doi:10.1109/ICCAD57390.2023.10320944.
- ^ a b Yan, Z.; Qin, Y.; Hu, X. S.; Shi, Y. (September 2023). "On the viability of using LLMs for SW/HW co-design: An example in designing CiM DNN accelerators". 2023 IEEE 36th International System-on-Chip Conference (SOCC). IEEE. pp. 1–6. doi:10.1109/SOCC58487.2023.10278882.
- ^ a b Li, M.; Fang, W.; Zhang, Q.; Xie, Z. (2024). "SpecLLM: Exploring generation and review of VLSI design specification with large language model". arXiv:2401.13266.
- ^ "What Is a Generative Adversarial Network (GAN)?". Oracle Corporation. Retrieved 7 June 2025.
- ^ "What Are Generative Adversarial Networks (GANs)?". Amazon Web Services. Retrieved 7 June 2025.
- ^ a b Alawieh, M. B.; Lin, Y.; Zhang, Z.; Li, M.; Huang, Q.; Pan, D. Z. (2020). "GAN-SRAF: subresolution assist feature generation using generative adversarial networks". IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems. 40 (2): 373–385. doi:10.1109/TCAD.2020.3002637.
- ^ a b Yang, H.; Li, S.; Ma, Y.; Yu, B.; Young, E. F. (June 2018). "GAN-OPC: Mask optimization with lithography-guided generative adversarial nets". Proceedings of the 55th Annual Design Automation Conference. pp. 1–6. doi:10.1145/3195970.3196011.
- ^ a b c d e f g h i j k l m n o p q r s t u v w x y z aa ab ac ad Rapp, M.; Amrouch, H.; Lin, Y.; Yu, B.; Pan, D. Z.; Wolf, M.; Henkel, J. (2021). "MLCAD: A survey of research in machine learning for CAD keynote paper". IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems. 41 (10): 3162–3181. doi:10.1109/TCAD.2021.3124762.
- ^ a b c d e f g h i j k Gubbi, K. I.; Beheshti-Shirazi, S. A.; Sheaves, T.; Salehi, S.; PD, S. M.; Rafatirad, S. (June 2022). "Survey of machine learning for electronic design automation". Proceedings of the Great Lakes Symposium on VLSI 2022. pp. 513–518. doi:10.1145/3526241.3530834.
- ^ a b c d e f g h i j k l m n o p q r s t u v w x y z aa ab ac ad ae Chen, L.; Chen, Y.; Chu, Z.; Fang, W.; Ho, T. Y.; Huang, R. (2024). "The dawn of AI-native EDA: Opportunities and challenges of large circuit models". arXiv:2403.07257.
- ^ a b "Synopsys.ai – Full Stack, AI-Driven EDA Suite" (PDF). Synopsys. Retrieved 7 June 2025.
- ^ a b "Cadence.AI: Transforming Chip Design with Agentic AI Workflows". Cadence Design Systems. Retrieved 7 June 2025.
- ^ "What is Electronic Design Automation (EDA)?". Cadence Design Systems. Retrieved 7 June 2025.
- ^ a b "A new era of EDA powered by AI". Siemens Digital Industries Software. Retrieved 7 June 2025.
- ^ a b "AMD Vivado Design Suite". AMD. Retrieved 7 June 2025.
- ^ "Vitis AI Developer Hub". AMD. Retrieved 7 June 2025.
- ^ "Design Automation Research Group". NVIDIA Research. Retrieved 7 June 2025.
- ^ "AutoDMP Optimizes Macro Placement for Chip Design with AI and GPUs". NVIDIA Developer Blog. 27 March 2023. Retrieved 7 June 2025.
- ^ "Scaling Your Chip Design Flow" (PDF). Google Cloud. Retrieved 7 June 2025.
- ^ a b "Leveraging IBM Cloud for electronic design automation (EDA) workloads". IBM. 31 October 2023. Retrieved 7 June 2025.
External links
The OpenROAD Project – Official website for the open-source autonomous layout generator.
Design Automation Conference (DAC) – Premier academic and industry conference for EDA.