Research Highlight

Published: 26 July 2021

SILICON PHOTONICS

Co-packaged transceivers speed up

Christiana Varnava

Nature Electronics, volume 4, page 455 (2021)


Subjects: Engineering, Optical techniques

In Proc. 2021 IEEE Symposium on VLSI Technology (in the press); https://go.nature.com/3hL2gsH

Integrated optical input–output technologies are promising for high-speed communications because of their scaling and bandwidth advantages over electrical alternatives. Optical transceivers based on microring modulators can, for example, achieve high-throughput transmission using wavelength-division multiplexing. However, such transceivers have so far only demonstrated capacities of up to 50 Gbit s−1 in the O-band (1,260 nm to 1,360 nm). Jahnavi Sharma, Hao Li and colleagues at Intel Corporation now show that a hybrid integrated transceiver based on photonic and electronic circuits can achieve a capacity of up to 112 Gbit s−1 in the O-band using a four-level pulse-amplitude modulation (PAM-4) scheme.
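To put the headline numbers in context: PAM-4 encodes two bits per symbol, so a 112 Gbit s−1 lane corresponds to a 56 GBd symbol rate, and several such wavelength channels can be multiplexed to scale aggregate throughput. The short sketch below works through that arithmetic in Python; the channel count is an assumption for illustration, not a figure reported by the authors.

```python
# Illustrative arithmetic only: relate per-lane data rate, PAM-4 symbol rate,
# and a hypothetical aggregate WDM throughput for an O-band transceiver.
from math import log2

bits_per_symbol = log2(4)        # PAM-4: four amplitude levels -> 2 bits per symbol
lane_rate_gbps = 112             # per-lane data rate quoted in the highlight
symbol_rate_gbaud = lane_rate_gbps / bits_per_symbol

n_wavelengths = 8                # assumed number of WDM channels (not from the paper)
aggregate_gbps = n_wavelengths * lane_rate_gbps

print(f"symbol rate per lane: {symbol_rate_gbaud:.0f} GBd")             # 56 GBd
print(f"aggregate throughput, {n_wavelengths} channels: {aggregate_gbps} Gbit/s")
```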


Author information

Christiana Varnava, Nature Electronics (https://www.nature.com/natelectron)

Correspondence to Christiana Varnava.


About this article

Cite this article

Varnava, C. Co-packaged transceivers speed up. Nat Electron 4 , 455 (2021). https://doi.org/10.1038/s41928-021-00628-3



VLSI-SoC: Design Trends

28th IFIP WG 10.5/IEEE International Conference on Very Large Scale Integration, VLSI-SoC 2020, Salt Lake City, UT, USA, October 6–9, 2020, Revised and Extended Selected Papers

  • Conference proceedings
  • © 2021
  • Andrea Calimera (ORCID: https://orcid.org/0000-0001-5881-3811)
  • Pierre-Emmanuel Gaillardon (ORCID: https://orcid.org/0000-0003-3634-3999)
  • Kunal Korgaonkar (ORCID: https://orcid.org/0000-0002-9078-2944)
  • Shahar Kvatinsky (ORCID: https://orcid.org/0000-0001-7277-7271)
  • Ricardo Reis (ORCID: https://orcid.org/0000-0001-5781-5858)

Politecnico di Torino, Turin, Italy

University of Utah, Salt Lake City, USA

Technion – Israel Institute of Technology, Haifa, Israel

Universidade Federal do Rio Grande do Sul, Porto Alegre, Brazil

Part of the book series: IFIP Advances in Information and Communication Technology (IFIPAICT, volume 621)

Included in the following conference series:

  • VLSI-SoC: IFIP/IEEE International Conference on Very Large Scale Integration - System on a Chip

Conference proceedings info: VLSI-SoC 2020.



About this book

The 16 full papers included in this volume were carefully reviewed and selected from the 38 papers (out of 74 submissions) presented at the conference. The papers discuss the latest academic and industrial results and developments, as well as future trends, in the field of system-on-chip (SoC) design, considering the challenges of nanoscale, state-of-the-art and emerging manufacturing technologies. In particular, they address cutting-edge research fields such as low-power design of RF, analog and mixed-signal circuits; EDA tools for the synthesis and verification of heterogeneous SoCs; accelerators for cryptography and deep learning; on-chip interconnection systems; reliability and testing; and the integration of 3D ICs.

*The conference was held virtually.

Keywords

  • artificial intelligence
  • communication systems
  • computer hardware
  • computer-aided design
  • distributed computer systems
  • distributed systems
  • embedded systems
  • field programmable gate array
  • integrated circuits
  • microprocessor chips
  • network protocols
  • parallel processing systems
  • signal processing
  • telecommunication systems
  • vlsi circuits

Table of contents (16 papers)

Front Matter

Low-Power High-Speed ADCs for ADC-Based Wireline Receivers in 22 nm FDSOI

  • David Cordova, Wim Cops, Yann Deval, François Rivet, Herve Lapuyade, Nicolas Nodenot et al.

Mixed-Mode Signal Processing for Implementing MCMC MIMO Detector

  • Amin Aghighi, Behrouz Farhang-Boroujeny, Armin Tajalli

Low Power Current-Mode Relaxation Oscillators for Temperature and Supply Voltage Monitoring

  • Shanshan Dai, Caleb R. Tulloss, Xiaoyu Lian, Kangping Hu, Sherief Reda, Jacob K. Rosenstein

Fully-Autonomous SoC Synthesis Using Customizable Cell-Based Analog and Mixed-Signal Circuits Generation

  • Tutu Ajayi, Sumanth Kamineni, Morteza Fayazi, Yaswanth K. Cherivirala, Kyumin Kwon, Shourya Gupta et al.

Assessing the Configuration Space of the Open Source NVDLA Deep Learning Accelerator on a Mainstream MPSoC Platform

  • Alessandro Veronesi, Davide Bertozzi, Milos Krstic

SAT-Based Mapping of Data-Flow Graphs onto Coarse-Grained Reconfigurable Arrays

  • Yukio Miyasaka, Masahiro Fujita, Alan Mishchenko, John Wawrzynek

Learning Based Timing Closure on Relative Timed Design

  • Tannu Sharma, Sumanth Kolluru, Kenneth S. Stevens

Multilevel Signaling for High-Speed Chiplet-to-Chiplet Communication

  • Rakshith Saligram, Ankit Kaul, Muhannad S. Bakir, Arijit Raychowdhury

From Informal Specifications to an ABV Framework for Industrial Firmware Verification

  • Samuele Germiniani, Moreno Bragaglio, Graziano Pravadelli

Modular Functional Testing: Targeting the Small Embedded Memories in GPUs

  • Josie Esteban Rodriguez Condia, Matteo Sonza Reorda

RAT: A Lightweight Architecture Independent System-Level Soft Error Mitigation Technique

  • Jonas Gava, Ricardo Reis, Luciano Ost

SANSCrypt: Sporadic-Authentication-Based Sequential Logic Encryption

  • Yinghua Hu, Kaixin Yang, Shahin Nazarian, Pierluigi Nuzzo

3D Nanofabric: Layout Challenges and Solutions for Ultra-scaled Logic Designs

  • Edouard Giacomin, Juergen Boemmels, Julien Ryckaert, Francky Catthoor, Pierre-Emmanuel Gaillardon

3D Logic Cells Design and Results Based on Vertical NWFET Technology Including Tied Compact Model

  • Arnaud Poittevin, Chhandak Mukherjee, Ian O’Connor, Cristell Maneux, Guilhem Larrieu, Marina Deng et al.

Statistical Array Allocation and Partitioning for Compute In-Memory Fabrics

  • Brian Crafton, Samuel Spetalnick, Gauthaman Murali, Tushar Krishna, Sung-Kyu Lim, Arijit Raychowdhury

abstractPIM: A Technology Backward-Compatible Compilation Flow for Processing-In-Memory

  • Adi Eliahu, Rotem Ben-Hur, Ronny Ronen, Shahar Kvatinsky

Back Matter

Editors and Affiliations

Andrea Calimera, Politecnico di Torino, Turin, Italy

Pierre-Emmanuel Gaillardon, University of Utah, Salt Lake City, USA

Kunal Korgaonkar and Shahar Kvatinsky, Technion – Israel Institute of Technology, Haifa, Israel

Ricardo Reis, Universidade Federal do Rio Grande do Sul, Porto Alegre, Brazil

Bibliographic Information

Book Title: VLSI-SoC: Design Trends

Book Subtitle: 28th IFIP WG 10.5/IEEE International Conference on Very Large Scale Integration, VLSI-SoC 2020, Salt Lake City, UT, USA, October 6–9, 2020, Revised and Extended Selected Papers

Editors: Andrea Calimera, Pierre-Emmanuel Gaillardon, Kunal Korgaonkar, Shahar Kvatinsky, Ricardo Reis

Series Title: IFIP Advances in Information and Communication Technology

DOI: https://doi.org/10.1007/978-3-030-81641-4

Publisher: Springer Cham

eBook Packages: Computer Science, Computer Science (R0)

Copyright Information: IFIP International Federation for Information Processing 2021

Hardcover ISBN: 978-3-030-81640-7 (published 15 July 2021)

Softcover ISBN: 978-3-030-81643-8 (published 15 July 2022)

eBook ISBN: 978-3-030-81641-4 (published 14 July 2021)

Series ISSN: 1868-4238

Series E-ISSN: 1868-422X

Edition Number: 1

Number of Pages: XVIII, 364

Number of Illustrations: 70 b/w illustrations, 139 illustrations in colour

Topics: Computer Systems Organization and Communication Networks, Control Structures and Microprogramming, Input/Output and Data Communications, Information Systems Applications (incl. Internet)




VLSI Circuits & Systems Design


Special Issue Information


A special issue of Electronics (ISSN 2079-9292). This Special Issue belongs to the journal section "Circuit and Signal Processing".

Deadline for manuscript submissions: closed (30 June 2022) | Viewed by 26290


Dear Colleagues,

The focus of this Special Issue is on the research challenges related to the design of emerging microelectronics and VLSI circuits and systems that meet the demanding specifications of innovative applications. The Special Issue considers challenges in areas such as low power consumption, small integration area, testing and security, without being limited to them. Authors are encouraged to submit work related to emerging research topics and applications, such as hardware security, low-power IoT devices and high-performance processing cores.

The topics of interest include, but are not limited to:

  • Device modeling
  • Emerging technologies
  • CAD for VLSI design
  • Hardware/software co-design
  • Testing and verification
  • FPGA-based design
  • Embedded systems
  • Low-power circuits and systems
  • Hardware security
  • Emerging applications
  • VLSI for AI and ML algorithms

Dr. Athanasios Kakarountas, Guest Editor


Published Papers (11 papers)


VLSI Testing: Recently Published Documents


Embedded Radiation sensor with OBIST structure for applications in mixed signal systems

Oscillation-based testing (OBT) has proven to be a simple and effective test strategy for numerous kinds of circuits. In this work, OBT is applied to a radiation sensor to be used as a VLSI cell in embedded applications, implementing an oscillation built-in self-test (OBIST) structure. The oscillation condition is achieved by means of a minimally intrusive switched feedback loop, and the response evaluation circuit can be included in a very simple way, minimizing the hardware overhead. Fault simulation indicates a fault coverage of 100% for the circuit under test. Keywords: fault simulation, mixed-signal testing, OBIST, oscillation-based test, VLSI testing.
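The abstract's fault-coverage claim can be illustrated with a toy oscillation-based test flow: inject each modeled fault, observe the loop's oscillation frequency, and count a fault as detected when that frequency leaves a fault-free tolerance band. The sketch below only illustrates that decision rule; the fault list, frequencies and tolerance are invented, not taken from the paper.

```python
# Toy oscillation-based test (OBT) evaluation. A fault counts as detected when
# the oscillation frequency deviates from the fault-free value by more than a
# tolerance band. All numbers are hypothetical.
NOMINAL_FREQ_MHZ = 10.0
TOLERANCE = 0.05                      # +/- 5% band around the fault-free frequency

# (fault name, oscillation frequency observed with that fault injected)
fault_sims = [
    ("R1_short", 14.2),
    ("C2_open", 0.0),                 # loop stops oscillating entirely
    ("M3_stuck_on", 10.1),            # stays inside the band -> escapes this test
    ("M4_stuck_off", 7.6),
]

def detected(freq_mhz: float) -> bool:
    return abs(freq_mhz - NOMINAL_FREQ_MHZ) > TOLERANCE * NOMINAL_FREQ_MHZ

caught = [name for name, freq in fault_sims if detected(freq)]
coverage = 100.0 * len(caught) / len(fault_sims)
print(f"detected faults: {caught}")
print(f"fault coverage: {coverage:.0f}%")
```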

Implementation of a Parallel Fault Simulation System using PODEM in a Hardware Accelerator using Python

VLSI testing is one of the essential domains in modern chip design. As transistor channel lengths continue to shrink, the number of transistors on a chip increases, and with it the probability of defects or faults. An automatic test pattern generator (ATPG) finds input test vectors that help identify such faults if they are present, and PODEM is one widely used ATPG algorithm. This paper reduces the runtime of the algorithm through parallelism: different stuck-at faults in the gate-level circuit are simulated in parallel.
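The core idea of simulating many stuck-at faults at once can be sketched with classic bit-parallel fault simulation, where each bit position of a machine word carries one fault machine and bit 0 carries the fault-free circuit. The example below uses a tiny hand-written two-input circuit and exhaustive patterns purely for illustration; it is not the hardware-accelerated system described in the paper.

```python
# Minimal bit-parallel stuck-at fault simulator for a toy circuit:
#   n1 = a AND b, n2 = NOT a, out = n1 OR n2.
# Bit 0 of every word is the fault-free machine; bit i (i >= 1) is the machine
# with fault FAULTS[i - 1] injected.
SIGNALS = ["a", "b", "n1", "n2", "out"]
FAULTS = [(sig, val) for sig in SIGNALS for val in (0, 1)]   # (signal, stuck value)
WIDTH = 1 + len(FAULTS)                                      # machines packed per word
MASK = (1 << WIDTH) - 1

def inject(name, word):
    """Force the bit positions of the machines whose fault sits on `name`."""
    for i, (sig, val) in enumerate(FAULTS, start=1):
        if sig == name:
            word = (word | (1 << i)) if val else (word & ~(1 << i))
    return word & MASK

def simulate(a_bit, b_bit):
    """Evaluate the circuit for one input pattern across all fault machines."""
    a = inject("a", -a_bit & MASK)        # replicate the scalar input to every machine
    b = inject("b", -b_bit & MASK)
    n1 = inject("n1", a & b)
    n2 = inject("n2", ~a & MASK)
    return inject("out", n1 | n2)

detected = set()
for a_bit in (0, 1):
    for b_bit in (0, 1):
        out = simulate(a_bit, b_bit)
        good = -(out & 1) & MASK          # broadcast the fault-free output bit
        diff = out ^ good                 # machines that differ are detected
        detected |= {FAULTS[i - 1] for i in range(1, WIDTH) if (diff >> i) & 1}

print(f"fault coverage: {len(detected)}/{len(FAULTS)} stuck-at faults detected")
```

With exhaustive patterns on this toy netlist every modeled fault is caught; a real flow would instead use PODEM-generated vectors and far larger fault lists.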

AI-Powered Terahertz VLSI Testing Technology for Ensuring Hardware Security and Reliability

A Low-Power True Single Phase Clock Scan Cell Design for VLSI Testing

AI-Powered THz VLSI Testing Technology

Methods of Automated Test Solutions Design for VLSI Testing

Covert Gates: Protecting Integrated Circuits with Undetectable Camouflaging

Integrated circuit (IC) camouflaging has emerged as a promising solution for protecting semiconductor intellectual property (IP) against reverse engineering. Existing methods of camouflaging are based on standard cells that can assume one of many Boolean functions, either through variation of transistor threshold voltage or contact configurations. Unfortunately, such methods lead to high area, delay and power overheads, and are vulnerable to invasive as well as non-invasive attacks based on Boolean satisfiability/VLSI testing. In this paper, we propose, fabricate, and demonstrate a new cell camouflaging strategy, termed as ‘covert gate’ that leverages doping and dummy contacts to create camouflaged cells that are indistinguishable from regular standard cells under modern imaging techniques. We perform a comprehensive security analysis of covert gate, and show that it achieves high resiliency against SAT and test-based attacks at very low overheads. We also derive models to characterize the covert cells, and develop measures to incorporate them into a gate-level design. Simulation results of overheads and attacks are presented on benchmark circuits.

A heuristic fault based optimization approach to reduce test vectors count in VLSI testing

A Comprehensive Review on Applications of Don't Care Bit Filling Techniques for Test Power Reduction in Digital VLSI Systems

Excessive power consumption during VLSI testing is a serious reliability concern for the ubiquitous silicon industry. Many low-power methodologies have been proposed in the literature to address test-mode power consumption, and don't-care bit (X) filling approaches are one of them. These X-filling techniques have drawn significant attention from industry and academia because of their compatibility with existing design flows: neither modification of the circuit under test (CUT) nor a rerun of the time-consuming ATPG process is required. This paper presents an empirical survey of X-filling techniques applied to mitigate the two main types of dynamic power dissipation that occur during full-scan testing, namely shift power and capture power.
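To make the shift-power angle concrete, the sketch below compares a naive 0-fill of the don't-care bits in a scan cube against adjacent fill, which repeats the last specified value so that consecutive scan cells rarely toggle; fewer adjacent toggles is a common proxy for lower shift power. The test cube is hypothetical and the example is not drawn from the surveyed papers.

```python
# Compare X-filling strategies on a partially specified scan test cube and
# count adjacent toggles as a rough proxy for shift power.
def zero_fill(cube: str) -> str:
    return cube.replace("X", "0")

def adjacent_fill(cube: str) -> str:
    filled, last = [], "0"                 # assume 0 before the first specified bit
    for bit in cube:
        last = bit if bit != "X" else last
        filled.append(last)
    return "".join(filled)

def shift_transitions(vector: str) -> int:
    """Count 0->1 / 1->0 toggles between neighbouring scan cells."""
    return sum(a != b for a, b in zip(vector, vector[1:]))

cube = "1XX1XX0XX1XX"                      # hypothetical test cube with don't-cares
for name, fill in (("0-fill", zero_fill), ("adjacent fill", adjacent_fill)):
    filled = fill(cube)
    print(f"{name:14s} {filled}  transitions = {shift_transitions(filled)}")
```

On this cube, 0-fill produces five toggles while adjacent fill produces two, which is the kind of reduction these techniques target.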

VLSI Testing



Latest Research Topics in VLSI Design

VLSI PHD RESEARCH

Consider research in areas such as electronics, electrical engineering, computer science, artificial intelligence and wireless communication, fields that form the base of today's high-tech world. Researchers in these fields have developed technology-aided applications for every domain, from biomedicine to aerospace and construction, areas that once had little connection to electronics or even electricity.

Because these fields provide the foundation for the developing world and supply it with reliable technologies used in real time, the researcher's work has become broader, spanning everything from the initial idea to its realization in the real world as an application or product.

To arrive at a reliable working model, the idea behind a VLSI design project (for example, a speech processing application or a biomedical monitoring system) needs to be implemented, re-tested and refined repeatedly. Several development cycles and techniques are available that ease this implementation, such as:

  • Behavioral simulation
  • Software based model
  • Hardware Implementation (ASIC)
  • Programmable hardware (FPGA)
  • Co-simulation

Behavioral simulation is used in the initial phase; it is not appropriate for testing the real-time behavior of the system in its actual environment, because it is closer to the system's behavior in an ideal environment.

We can approximate the actual environment using software models (much like the channel models used to test communication systems), but their fidelity is limited by our ability to capture environmental conditions in mathematical equations and models. A minimal example follows.
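As a concrete example of such a software channel model, the sketch below simulates a BPSK link over an additive white Gaussian noise (AWGN) channel and estimates its bit error rate, the kind of quick check one might run before committing a communication design to FPGA or ASIC. All parameters are illustrative and not tied to any particular project.

```python
# Minimal software channel model: BPSK over AWGN with a hard-decision detector.
import numpy as np

rng = np.random.default_rng(0)
n_bits = 100_000
ebn0_db = 6.0                                   # assumed Eb/N0 operating point

bits = rng.integers(0, 2, n_bits)
symbols = 2 * bits - 1                          # BPSK mapping: 0 -> -1, 1 -> +1
noise_std = np.sqrt(1.0 / (2.0 * 10 ** (ebn0_db / 10.0)))
received = symbols + noise_std * rng.standard_normal(n_bits)

decoded = (received > 0).astype(int)            # hard-decision detector
ber = np.mean(decoded != bits)
print(f"simulated BER at Eb/N0 = {ebn0_db} dB: {ber:.2e}")
```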

ASICs are familiar for their high performance and hardwired implementation. They are well suited to the final implementation, but not to intermediate stages of development and testing. Nothing beats an ASIC for real-time testing of analog VLSI circuits, but for digital circuits and DSP applications an FPGA (field-programmable gate array) is the better option.

Hardware co-simulation is a good way to test and monitor systems in real time. For more details about a PhD thesis in VLSI, you can do online research or contact us.

Latest Low-Power Research Topics in VLSI Design

The Research Support Centre provides expert advice and support across the whole Engineering and Technical research lifecycle, from discovery through exploitation of technical and translational research. The centre has two primary functions:

  • i) to facilitate the delivery of the Engineering Sciences research strategy and to build partnerships, and
  • ii) to bring together all the technical research management and support services for students.

To achieve these goals the centre is made up of two inter-relating components. The Academic Research Support Centre consists of the Research Coordination Office, Platform Technologies team and a Translational Research Office. The Technical Research Support Centre is made up of the Joint Research Office.

The Research Support Centre encompasses a wide range of expertise and facilities. By coordinating these resources, we can provide researchers with a package of support that is integrated, high quality and streamlined – and clearly accountable.

Once a researcher has a proposal for high-quality research, they can access all the help and resources they need through one gateway. This includes support with the approval process and funding applications, and help setting up technical trials.

VLSI PHD Projects

Our research interests cover low-power processor architectures, low-power circuit design techniques, analog and mixed-signal circuit design, rapid prototyping of digital systems, reconfigurable processors, digital arithmetic, advanced processor architectures, VLSI implementation of signal and image processing algorithms, testing and verification, memory design, embedded VLSI and asynchronous circuits.

Our organization is engaged in embedded product development and serves various business solutions such as:

  • Embedded System Product Development,
  • Software services,
  • Android development,
  • Web development.

PhD/MPhil project assistance: we look forward to welcoming you to our Research and Development Division for PhD research scholars, and we will arrange the following for completing your PhD degree:

  • University admission: we provide a step-by-step guide to completing the application form and will help make the process as straightforward as possible.
  • Guide Arrangement
  • Survey Paper Preparation
  • Problem identification: identifying problems in the existing system.
  • Implementation in all domains
  • Mobile Ad hoc Networks
  • Wireless Networks
  • Image Processing
  • Grid Computing
  • Distributed Computing
  • Natural Language Processing
  • Cloud Computing
  • Soft Computing
  • Data Mining
  • Wireless Sensor Networks

Delivering effective support for your PhD work:

We offer simple, practical advice on the problems of getting started and getting organized while working on PhD projects.

We help you understand the practicalities of surviving the process and divide the huge task into less challenging pieces. The training includes a suggested structure and a guide to what should go in each section.

We offer complete support, with real-time exposure, for your PhD work in the field of VLSI. Our mission drives us to deliver applications and products with complete integrity and innovative, interesting ideas.

  • Assistance in ALL Stages of your PhD Research in VLSI from Topic Selection to Thesis Submission.
  • Building full confidence in submitting your thesis work.
  • Our experienced professionals support you in your research works.
  • Providing complete solutions for the Research Scholars in many advanced domains.

Technologies used in VLSI:

  • ModelSim 6.5b simulator
  • Xilinx ISE 10.1 System Generator
  • Quartus 11.1
  • Tanner v7 EDA tool
  • W-Edit
  • Microwind & DSCH v2
  • P-Spice
  • LT-Spice
  • Spartan-3E FPGA
  • Hardware description language: Verilog HDL

CORE AREA OF GUIDANCE:

  • Digital signal processing VLSI
  • Image processing VLSI
  • Wireless VLSI
  • Communication VLSI
  • Testing VLSI
  • Digital CMOS VLSI
  • Low-power VLSI
  • Core VLSI
  • Memory designs

PROJECT SUPPORT:

  • Confirmation letter
  • Attendance certificate
  • Completion certificate

Preprocessing work:

  • Paper selection

Identifying the problem:

  • Screenshots
  • Simulation report
  • Synthesis report

Report materials:

  • Block diagrams
  • Review details
  • Relevant materials
  • Presentation
  • Supporting documents
  • Software e-books
  • Software development standards & procedures e-book

Learning exposure:

  • Programming classes
  • Practical training
  • Project design & implementation

Publishing support:

  • Conference support
  • Journal support
  • Guide arrangements

VLSI-based projects include image processing projects, low-power projects, MATLAB-with-VLSI projects, cryptography projects, OFDM projects, SDR projects, communication projects, ZigBee projects and digital signal processing projects, as well as protocol interfacing projects such as UART, I2C and SPI.

Signal and image processing projects can be simulated using ModelSim 6.5b and synthesized with Xilinx ISE 10.1 targeting a Spartan-3E FPGA, or with Quartus 11.1 targeting an Altera DE2 FPGA. In image processing projects, the input image or video can be converted to coefficients using MATLAB. Low-power projects can be designed using Tanner, Microwind and SPICE tools.

We focus on imparting broad exposure to the concepts and design methodologies of all major aspects of VLSI engineering relevant to industry needs and to ground-breaking ideas.


ASU hosts 2024 IEEE VLSI Test Symposium

by TJ Triolo | May 7, 2024 | News


Microelectronic chips have become ubiquitous in modern technology. Vehicles of all types, computers, smartphones, appliances, defense technology and more use microelectronic components, also known as semiconductor chips.

As more and more of society depends on microelectronics, ensuring their reliability becomes increasingly crucial to avoid issues ranging from personal computer failures to plane crashes. The microelectronics test field formed to address this problem and keep the devices the modern world depends on every day humming along smoothly.

To advance microelectronics testing and ensure professionals have the most up-to-date knowledge, the Institute of Electrical and Electronics Engineers, or IEEE, hosts the annual VLSI Test Symposium. Held on Arizona State University’s Tempe campus, the 2024 event marked the 42nd edition of the conference, the first time it took place at a university, and it recorded the largest attendance in the event’s history.

Testing reception to new VLSI test ideas

Attendees from industry, government institutions and universities from around the world network with other microelectronics testing experts, showcase their discoveries in research papers published as part of the event proceedings, listen to keynote speeches and exchange ideas in various aspects of testing. Conference participants discuss testing capabilities for the latest developments in electronics designed with very large-scale integration , or VLSI.

VLSI devices use thousands to billions of transistors on a single chip and include those for artificial intelligence, or AI, 5G communications technology and more.

ASU electrical engineering faculty were heavily involved in the organization of the 2024 symposium. Sule Ozev , a professor of electrical engineering in the Ira A. Fulton Schools of Engineering at ASU, served as one of the general chairs for the event, leading its organization and the associated logistics.

Ozev, a faculty member in the School of Electrical, Computer and Energy Engineering , part of the Fulton Schools, has attended more than 20 VLSI Test Symposium events and has been involved in organizing them since 2005.

“I got introduced to the VLSI Test Symposium in 1997, when I was working on my doctoral thesis,” she says. “It has been a great conference to present cutting-edge research as well as discuss ideas and network with peers.”

Jennifer Kitchen and Jennifer Blain Christen , both Fulton Schools associate professors of electrical engineering, were also involved in organizing the conference as part of the Program Committee. In total, more than 20 Fulton Schools students and faculty members were involved in presenting their work in a variety of sessions throughout the symposium.

Beyond faculty and student involvement, the ASU Center for Semiconductor Microelectronics , or ACME, was among the event’s sponsors.

Breaking symposium boundaries

Janusz Rajski, a Program Committee member and vice president of engineering at Siemens Digital Industries Software , was impressed with ASU’s hosting of the VLSI Test Symposium. He says he admires Ozev and ASU Fulton Professor of Microelectronics Krishnendu Chakrabarty ’s research but had never visited one of ASU’s campuses before.

Rajski’s hope for the conference is that collaborative solutions come from the technical discussions and panels that can solve issues facing the semiconductor industry. He gave a keynote speech about challenges facing silicon microelectronics to ensure they’re designed to be testable to detect malfunctions and remain reliable and secure throughout their life cycles.

“The symposium has traditionally put a lot of emphasis on new ideas and innovation,” Rajski says. “Today there are many challenges facing the semiconductor industry, and we will need many good ideas to effectively solve them.”

Ozev concurs that networking and collaboration are among the symposium’s biggest benefits. She says electronics testing solutions company Teradyne announced three internships at the conference and gave student attendees priority on applying before the positions were posted publicly online.

“Networking is essential for collaboration as well as finding good employment opportunities,” Ozev says. “The conference is also the place many attendees get together face-to-face and discuss interesting research ideas.”

She says many attendees told her they were impressed by ASU’s research and students. Through National Science Foundation grants, Ozev secured funds for 24 students from the university to attend the conference who would not otherwise have the opportunity.

For her, the increased visibility of ASU and the positive impression it made on peer institutions in the microelectronics test community make the event a success.

“It is rare for an IEEE conference of this esteem to be held at a university campus,” Ozev says. “ASU is once again a leader.”


MIT News | Massachusetts Institute of Technology


Ultrasound offers a new way to perform deep brain stimulation


Deep brain stimulation, by implanted electrodes that deliver electrical pulses to the brain, is often used to treat Parkinson’s disease and other neurological disorders. However, the electrodes used for this treatment can eventually corrode and accumulate scar tissue, requiring them to be removed.

MIT researchers have now developed an alternative approach that uses ultrasound instead of electricity to perform deep brain stimulation, delivered by a fiber about the thickness of a human hair. In a study of mice, they showed that this stimulation can trigger neurons to release dopamine, in a part of the brain that is often targeted in patients with Parkinson’s disease.

“By using ultrasonography, we can create a new way of stimulating neurons to fire in the deep brain,” says Canan Dagdeviren, an associate professor in the MIT Media Lab and the senior author of the new study. “This device is thinner than a hair fiber, so there will be negligible tissue damage, and it is easy for us to navigate this device in the deep brain.”


In addition to offering a potentially safer way to deliver deep brain stimulation, this approach could also become a valuable tool for researchers seeking to learn more about how the brain works.

MIT graduate student Jason Hou and MIT postdoc Md Osman Goni Nayeem are the lead authors of the paper, along with collaborators from MIT’s McGovern Institute for Brain Research, Boston University, and Caltech. The study appears today in Nature Communications .

Deep in the brain

Dagdeviren’s lab has previously developed wearable ultrasound devices that can be used to deliver drugs through the skin or perform diagnostic imaging on various organs . However, ultrasound cannot penetrate deeply into the brain from a device attached to the head or skull.

“If we want to go into the deep brain, then it cannot be just wearable or attachable anymore. It has to be implantable,” Dagdeviren says. “We carefully customize the device so that it will be minimally invasive and avoid major blood vessels in the deep brain.”

Deep brain stimulation with electrical impulses is FDA-approved to treat symptoms of Parkinson’s disease. This approach uses millimeter-thick electrodes to activate dopamine-producing cells in a brain region called the substantia nigra. However, once implanted in the brain, the devices eventually begin to corrode, and scar tissue that builds up surrounding the implant can interfere with the electrical impulses.

The MIT team set out to see if they could overcome some of those drawbacks by replacing electrical stimulation with ultrasound. Most neurons have ion channels that are responsive to mechanical stimulation, such as the vibrations from sound waves, so ultrasound can be used to elicit activity in those cells. However, existing technologies for delivering ultrasound to the brain through the skull can’t reach deep into the brain with high precision because the skull itself can interfere with the ultrasound waves and cause off-target stimulation.

“To precisely modulate neurons, we must go deeper, leading us to design a new kind of ultrasound-based implant that produces localized ultrasound fields,” Nayeem says. To safely reach those deep brain regions, the researchers designed a hair-thin fiber made from a flexible polymer. The tip of the fiber contains a drum-like ultrasound transducer with a vibrating membrane. When this membrane, which encapsulates a thin piezoelectric film, is driven by a small electrical voltage, it generates ultrasonic waves that can be detected by nearby cells.

“It’s tissue-safe, there’s no exposed electrode surface, and it’s very low-power, which bodes well for translation to patient use,” Hou says.

In tests in mice, the researchers showed that this ultrasound device, which they call ImPULS (Implantable Piezoelectric Ultrasound Stimulator), can provoke activity in neurons of the hippocampus. Then, they implanted the fibers into the dopamine-producing substantia nigra and showed that they could stimulate neurons in the dorsal striatum to produce dopamine.

“Brain stimulation has been one of the most effective, yet least understood, methods used to restore health to the brain. ImPULS gives us the ability to stimulate brain cells with exquisite spatial-temporal resolution and in a manner that doesn’t produce the kind of damage or inflammation as other methods. Seeing its effectiveness in areas like the hippocampus opened an entirely new way for us to deliver precise stimulation to targeted circuits in the brain,” says Steve Ramirez, an assistant professor of psychological and brain sciences at Boston University, and a faculty member at B.U.’s Center for Systems Neuroscience, who is also an author of the study.

A customizable device

All of the components of the device are biocompatible, including the piezoelectric layer, which is made of a novel ceramic called potassium sodium niobate, or KNN. The current version of the implant is powered by an external power source, but the researchers envision that future versions could be powered by a small implantable battery and electronics unit.

The researchers developed a microfabrication process that enables them to easily alter the length and thickness of the fiber, as well as the frequency of the sound waves produced by the piezoelectric transducer. This could allow the devices to be customized for different brain regions.

“We cannot say that the device will give the same effect on every region in the brain, but we can easily and very confidently say that the technology is scalable, and not only for mice. We can also make it bigger for eventual use in humans,” Dagdeviren says.

The researchers now plan to investigate how ultrasound stimulation might affect different regions of the brain, and if the devices can remain functional when implanted for year-long timescales. They are also interested in the possibility of incorporating a microfluidic channel, which could allow the device to deliver drugs as well as ultrasound.

In addition to holding promise as a potential therapeutic for Parkinson’s or other diseases, this type of ultrasound device could also be a valuable tool to help researchers learn more about the brain, the researchers say.

“Our goal is to provide this as a research tool for the neuroscience community, because we believe that we don’t have enough effective tools to understand the brain,” Dagdeviren says. “As device engineers, we are trying to provide new tools so that we can learn more about different regions of the brain.”

The research was funded by the MIT Media Lab Consortium and the Brain and Behavior Foundation Research (BBRF) NARSAD Young Investigator Award.




2024 Environmental Performance Index: A Surprise Top Ranking, Global Biodiversity Commitment Tested

The Baltic nation of Estonia is No. 1 in the 2024 rankings, while Denmark, one of the top-ranked countries in the 2022 EPI, dropped to 10th place, highlighting the challenges of reducing emissions in hard-to-decarbonize industries. Meanwhile, “paper parks” are proving a global challenge to international biodiversity commitments.


In 2022, at the UN Biodiversity Conference (COP 15) in Montreal, over 190 countries made what has been called “the biggest conservation commitment the world has ever seen.” The Kunming-Montreal Global Biodiversity Framework called for the effective protection and management of 30% of the world’s terrestrial, inland water, and coastal and marine areas by the year 2030, commonly known as the 30x30 target. While there has been progress on paper toward reaching this ambitious goal of protecting 30% of land and seas, just ahead of World Environment Day the 2024 Environmental Performance Index (EPI), an analysis by Yale researchers that provides a data-driven summary of the state of sustainability around the world, shows that in many cases such protections have failed to halt ecosystem loss or curtail environmentally destructive practices.

A new metric that assesses how well countries are protecting important ecosystems indicated that while nations have made progress in protecting land and seas, many of these areas are “paper parks” where commercial activities such as mining and trawling continue to occur — sometimes at a higher rate than in non-protected areas. The EPI analyses show that in 23 countries, more than 10% of the land protected is covered by croplands and buildings, and in 35 countries there is more fishing activity inside marine protected areas than outside. 

“Protected areas are failing to achieve their goals in different ways,” said Sebastián Block Munguía, a postdoctoral associate with the Yale Center for Environmental Law and Policy (YCELP) and the lead author of the report. “In Europe, destructive fishing is allowed inside marine protected areas, and a large fraction of the area protected in land is covered by croplands, not natural ecosystems. In many developing countries, even when destructive activities are not allowed in protected areas, shortages of funding and personnel make it difficult to enforce rules.”

The 2024 EPI, published by the Yale Center for Environmental Law and Policy and Columbia University’s Center for International Earth Science Information Network, ranks 180 countries based on 58 performance indicators to track progress on mitigating climate change, promoting environmental health, and safeguarding ecosystem vitality. The data evaluate efforts by the nations to reach UN sustainability goals, the 2015 Paris Climate Change Agreement and the Kunming-Montreal Global Biodiversity Framework. The data underlying the different indicators come from a variety of academic institutions and international organizations and cover different periods: protected area coverage indicators are based on data from March 2024, while greenhouse gas emissions data are from 2022.
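The mechanics of such a composite index can be illustrated with a toy example: normalize each indicator to a common 0 to 100 scale, then combine the normalized scores with weights. The countries, indicators, weights and scores below are all hypothetical; they do not reproduce the EPI's actual methodology or data.

```python
# Toy composite-index construction: min-max normalize indicators, then take a
# weighted sum. Purely illustrative; not the EPI's real weights or data.
import pandas as pd

raw = pd.DataFrame(
    {
        "ghg_trend": [40, 59, 12],          # e.g. % emissions cut over a decade
        "air_quality": [80, 72, 55],
        "protected_area": [30, 45, 20],
    },
    index=["Country A", "Country B", "Country C"],
)
weights = {"ghg_trend": 0.5, "air_quality": 0.3, "protected_area": 0.2}  # assumed

norm = 100 * (raw - raw.min()) / (raw.max() - raw.min())   # scale each indicator to 0-100
scores = sum(norm[col] * w for col, w in weights.items())
print(scores.sort_values(ascending=False).round(1))
```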


The index found that many countries that were leading on sustainability goals have fallen behind or stalled, illustrating the challenges of reducing emissions in hard-to-decarbonize industries and resistant sectors such as agriculture. In several countries, recent drops in agricultural greenhouse gas (GHG) emissions have been the result of external circumstances, not policy. For example, in Albania, supply chain disruptions led to more expensive animal feed, which resulted in a sharp reduction in cows and, consequently, in nitrous oxide and methane emissions.

Estonia leads this year’s rankings with a 40% drop in GHG emissions over the last decade, largely attributed to replacing dirty oil shale power plants with cleaner energy sources. The country is drafting a proposal to achieve by 2040 a CO2 neutral energy sector and a CO2 neutral public transport network in bigger cities.

“Estonia has decreased its GHG emissions by 59% compared to 1990. The energy sector will be the biggest contributor in reducing emissions in the coming years as we have an aim to produce 100% of our electricity consumption from renewables by 2030,” said Kristi Klaas, Estonia’s vice-minister for Green Transition. Klaas discussed some of the policies that led to the country’s success, as well as ongoing challenges such as reducing emissions in the agriculture sector, at a webinar hosted by YCELP on June 3. Dr. Abdullah Ali Abdullah Al-Amri, chairman of the Environment Authority of Oman, also joined the webinar to discuss efforts aimed at protecting the country’s multiple ecosystems with rare biodiversity and endangered species, such as the Arabian oryx, and subspecies, such as the Arabian leopard.


Denmark, the top-ranked country in the 2022 EPI, dropped to 10th place as its pace of decarbonization slowed, highlighting that those early gains from implementing “low-hanging-fruit policies, such as switching to electricity generation from coal to natural gas and expanding renewable power generation are themselves insufficient,” the index notes. Emissions in the world’s largest economies, such as the U.S. (ranked 34th), are falling too slowly or still rising, as in China, Russia and India, which is ranked 176th.

Over the last decade, only five countries (Estonia, Finland, Greece, Timor-Leste and the United Kingdom) have cut their GHG emissions at the rate needed to reach net zero by 2050. Vietnam and other developing countries in Southeast and Southern Asia, such as Pakistan, Laos, Myanmar and Bangladesh, are ranked the lowest, indicating the urgency of international cooperation to help provide a path for struggling nations to achieve sustainability.

“The 2024 Environmental Performance Index highlights a range of critical sustainability challenges from climate change to biodiversity loss and beyond — and reveals trends suggesting that countries across the world need to redouble their efforts to protect critical ecosystems and the vitality of our planet,” said Daniel Esty, Hillhouse Professor of Environmental Law and Policy and director of YCELP.



Staff Working Paper 24-03: The Lock-In Effect of Rising Mortgage Rates

Ross M. Batzer, Jonah R. Coste, William M. Doerner, and Michael J. Seiler

Abstract:

People can be “locked-in” or constrained in their ability to make appropriate financial changes, such as being unable to move homes, change jobs, sell stocks, rebalance portfolios, shift financial accounts, adjust insurance policies, transfer investment profits, or inherit wealth. These frictions—whether institutional, legislative, personal, or market-driven—are often overlooked. Residential real estate exemplifies this challenge with its physical immobility, high transaction costs, and concentrated wealth. In the United States, nearly all 50 million active mortgages have fixed rates, and most have interest rates far below prevailing market rates, creating a disincentive to sell. This paper finds that for every percentage point that market mortgage rates exceed the origination interest rate, the probability of sale is decreased by 18.1%. This mortgage rate lock-in led to a 57% reduction in home sales with fixed-rate mortgages in 2023Q4 and prevented 1.33 million sales between 2022Q2 and 2023Q4. The supply reduction increased home prices by 5.7%, outweighing the direct impact of elevated rates, which decreased prices by 3.3%. These findings underscore how mortgage rate lock-in restricts mobility, results in people not living in homes they would prefer, inflates prices, and worsens affordability. Certain borrower groups with lower wealth accumulation are less able to strategically time their sales, worsening inequality.​
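The headline elasticity can be turned into a back-of-the-envelope calculation. The sketch below assumes the 18.1% reduction applies multiplicatively for each percentage point by which the market rate exceeds a loan's origination rate, and uses a made-up baseline sale probability; it only illustrates the stated relationship and is not the paper's estimation code.

```python
# Back-of-the-envelope lock-in illustration based on the paper's headline
# estimate. The baseline probability and the multiplicative interpretation
# are assumptions made for this sketch.
BASE_SALE_PROB = 0.05          # hypothetical annual sale probability with no rate gap
REDUCTION_PER_POINT = 0.181    # headline estimate from the working paper

def sale_probability(origination_rate: float, market_rate: float) -> float:
    gap = max(market_rate - origination_rate, 0.0)   # lock-in only when rates have risen
    return BASE_SALE_PROB * (1.0 - REDUCTION_PER_POINT) ** gap

for market in (3.0, 5.0, 7.0):
    prob = sale_probability(origination_rate=3.0, market_rate=market)
    print(f"market rate {market:.1f}%: sale probability ≈ {prob:.3f}")
```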

​Mortgage lock-in data are available below in two formats at the bottom of this webpage. The first file offers a data supplement that could be used to recreate figures shown in the working paper. The second file offers additional developmental data aggregates produced from estimations in the working paper. Both files are subject to change with working paper revisions. Our  FA​Qs  address common questions about the datasets. Please cite this working paper when using either dataset.​

  • Supplemental data for figures (1 MB)
  • Developmental data aggregates (45 MB)


Will Knight

OpenAI Offers a Peek Inside the Guts of ChatGPT


ChatGPT developer OpenAI’s approach to building artificial intelligence came under fire this week from former employees who accuse the company of taking unnecessary risks with technology that could become harmful.

Today, OpenAI released a new research paper apparently aimed at showing it is serious about tackling AI risk by making its models more explainable. In the paper , researchers from the company lay out a way to peer inside the AI model that powers ChatGPT. They devise a method of identifying how the model stores certain concepts—including those that might cause an AI system to misbehave.

Although the research makes OpenAI’s work on keeping AI in check more visible, it also highlights recent turmoil at the company. The new research was performed by the recently disbanded “superalignment” team at OpenAI that was dedicated to studying the technology’s long-term risks.

The former group’s coleads, Ilya Sutskever and Jan Leike—both of whom have left OpenAI —are named as coauthors. Sutskever, a cofounder of OpenAI and formerly chief scientist, was among the board members who voted to fire CEO Sam Altman last November, triggering a chaotic few days that culminated in Altman’s return as leader.

ChatGPT is powered by a family of so-called large language models called GPT, based on an approach to machine learning known as artificial neural networks. These mathematical networks have shown great power to learn useful tasks by analyzing example data, but their workings cannot be easily scrutinized as conventional computer programs can. The complex interplay between the layers of “neurons” within an artificial neural network makes reverse engineering why a system like ChatGPT came up with a particular response hugely challenging.

“Unlike with most human creations, we don’t really understand the inner workings of neural networks,” the researchers behind the work wrote in an accompanying blog post. Some prominent AI researchers believe that the most powerful AI models, including ChatGPT, could perhaps be used to design chemical or biological weapons and coordinate cyberattacks. A longer-term concern is that AI models may choose to hide information or act in harmful ways in order to achieve their goals.

OpenAI’s new paper outlines a technique that lessens the mystery a little, by identifying patterns that represent specific concepts inside a machine learning system with help from an additional machine learning model. The key innovation is in refining the smaller network used to peer inside the system of interest, making it more efficient at identifying concepts.
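
The article describes the additional model only at a high level. As a rough illustration of how such a concept-finding network can work, the sketch below uses the widely reported sparse-autoencoder formulation of this idea: a small network trained to reconstruct the large model's internal activations through a wide, sparse hidden layer whose individual units tend to align with human-interpretable concepts. The dimensions, top-k sparsity rule, and training details are illustrative assumptions, not OpenAI's actual configuration.

```python
import torch
import torch.nn as nn

# Sketch of a sparse autoencoder trained on a language model's internal
# activations. The wide, mostly zero hidden layer is what makes individual
# units easier to inspect as candidate "concepts". All sizes and the top-k
# sparsity rule below are illustrative assumptions.

class SparseAutoencoder(nn.Module):
    def __init__(self, d_model, d_features, k):
        super().__init__()
        self.encoder = nn.Linear(d_model, d_features)
        self.decoder = nn.Linear(d_features, d_model)
        self.k = k  # number of features allowed to stay active per example

    def forward(self, activations):
        latent = torch.relu(self.encoder(activations))
        # Keep only the k strongest feature activations, zeroing the rest.
        topk = torch.topk(latent, self.k, dim=-1)
        sparse = torch.zeros_like(latent).scatter(-1, topk.indices, topk.values)
        return self.decoder(sparse), sparse

# Training step sketch: minimize reconstruction error over captured activations.
sae = SparseAutoencoder(d_model=768, d_features=8192, k=16)
optimizer = torch.optim.Adam(sae.parameters(), lr=1e-4)
activations = torch.randn(32, 768)  # stand-in for activations captured from the model
reconstruction, features = sae(activations)
loss = nn.functional.mse_loss(reconstruction, activations)
loss.backward()
optimizer.step()
```

Once trained, each feature can be characterized by the inputs that activate it most strongly, which is in the spirit of the visualization work described below.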

OpenAI proved out the approach by identifying patterns that represent concepts inside GPT-4, one of its largest AI models. The company released code related to the interpretability work, as well as a visualization tool that can be used to see how words in different sentences activate concepts, including profanity and erotic content, in GPT-4 and another model. Knowing how a model represents certain concepts could be a step toward being able to dial down those associated with unwanted behavior, to keep an AI system on the rails. It could also make it possible to tune an AI system to favor certain topics or ideas.
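
The article does not say how such "dialing down" would be implemented; one common approach in the interpretability literature is to attenuate a learned concept direction in a layer's activations at inference time. The sketch below is a hypothetical illustration of that idea using PyTorch's generic forward-hook mechanism; the layer path, feature index, and scaling factor are placeholders, not anything confirmed by the paper.

```python
import torch

# Hypothetical sketch of "dialing down" a concept: attenuate one learned
# direction in a layer's output at inference time via a standard PyTorch
# forward hook. Assumes the hooked module returns a plain activation tensor;
# the layer path, feature index, and scale below are placeholders.

def make_suppression_hook(concept_direction, scale=-1.0):
    unit = concept_direction / concept_direction.norm()

    def hook(module, inputs, output):
        # Project the activations onto the concept direction and remove (or
        # rescale) that component before passing the output onward.
        coeff = (output * unit).sum(dim=-1, keepdim=True)
        return output + scale * coeff * unit

    return hook

# Usage sketch (names are placeholders, not a real model's attributes):
# direction = sae.decoder.weight[:, feature_index].detach()
# handle = model.layers[20].register_forward_hook(make_suppression_hook(direction))
# handle.remove()  # detach the hook when finished
```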

Even though LLMs defy easy interrogation, a growing body of research suggests they can be poked and prodded in ways that reveal useful information. Anthropic, an OpenAI competitor backed by Amazon and Google, published similar work on AI interpretability last month. To demonstrate how the behavior of AI systems might be tuned, the company's researchers created a chatbot obsessed with San Francisco's Golden Gate Bridge. And simply asking an LLM to explain its reasoning can sometimes yield insights.

“It’s exciting progress,” says David Bau, a professor at Northeastern University who works on AI explainability, of the new OpenAI research. “As a field, we need to be learning how to understand and scrutinize these large models much better.”

Bau says the OpenAI team’s main innovation is in showing a more efficient way to configure a small neural network that can be used to understand the components of a larger one. But he also notes that the technique needs to be refined to make it more reliable. “There’s still a lot of work ahead in using these methods to create fully understandable explanations,” Bau says.

Bau is part of a US government-funded effort called the National Deep Inference Fabric, which will make cloud computing resources available to academic researchers so that they too can probe especially powerful AI models. “We need to figure out how we can enable scientists to do this work even if they are not working at these large companies,” he says.

OpenAI’s researchers acknowledge in their paper that further work needs to be done to improve their method, but also say they hope it will lead to practical ways to control AI models. “We hope that one day, interpretability can provide us with new ways to reason about model safety and robustness, and significantly increase our trust in powerful AI models by giving strong assurances about their behavior,” they write.

The state of AI in early 2024: Gen AI adoption spikes and starts to generate value

If 2023 was the year the world discovered generative AI (gen AI), 2024 is the year organizations truly began using—and deriving business value from—this new technology. In the latest McKinsey Global Survey on AI, 65 percent of respondents report that their organizations are regularly using gen AI, nearly double the percentage from our previous survey just ten months ago. Respondents’ expectations for gen AI’s impact remain as high as they were last year, with three-quarters predicting that gen AI will lead to significant or disruptive change in their industries in the years ahead.

About the authors

This article is a collaborative effort by Alex Singla, Alexander Sukharevsky, Lareina Yee, and Michael Chui, with Bryce Hall, representing views from QuantumBlack, AI by McKinsey, and McKinsey Digital.

Organizations are already seeing material benefits from gen AI use, reporting both cost decreases and revenue jumps in the business units deploying the technology. The survey also provides insights into the kinds of risks presented by gen AI—most notably, inaccuracy—as well as the emerging practices of top performers to mitigate those challenges and capture value.

AI adoption surges

Interest in generative AI has also brightened the spotlight on a broader set of AI capabilities. For the past six years, AI adoption by respondents’ organizations has hovered at about 50 percent. This year, the survey finds that adoption has jumped to 72 percent (Exhibit 1). And the interest is truly global in scope. Our 2023 survey found that AI adoption did not reach 66 percent in any region; this year, however, more than two-thirds of respondents in nearly every region say their organizations are using AI. The exception is Central and South America, where 58 percent of respondents report AI adoption at their organizations. Looking by industry, the biggest increase in adoption can be found in professional services (organizations focused on human resources, legal services, management consulting, market research, R&D, tax preparation, and training).

Also, responses suggest that companies are now using AI in more parts of the business. Half of respondents say their organizations have adopted AI in two or more business functions, up from less than a third of respondents in 2023 (Exhibit 2).

Gen AI adoption is most common in the functions where it can create the most value

Most respondents now report that their organizations—and they as individuals—are using gen AI. Sixty-five percent of respondents say their organizations are regularly using gen AI in at least one business function, up from one-third last year. The average organization using gen AI is doing so in two functions, most often in marketing and sales and in product and service development—two functions in which previous research (“The economic potential of generative AI: The next productivity frontier,” McKinsey, June 14, 2023) determined that gen AI adoption could generate the most value—as well as in IT (Exhibit 3). The biggest increase from 2023 is found in marketing and sales, where reported adoption has more than doubled. Yet across functions, only two use cases, both within marketing and sales, are reported by 15 percent or more of respondents.

Gen AI also is weaving its way into respondents’ personal lives. Compared with 2023, respondents are much more likely to be using gen AI at work and even more likely to be using gen AI both at work and in their personal lives (Exhibit 4). The survey finds upticks in gen AI use across all regions, with the largest increases in Asia–Pacific and Greater China. Respondents at the highest seniority levels, meanwhile, show larger jumps in the use of gen AI tools for work and outside of work compared with their midlevel-management peers. Looking at specific industries, respondents working in energy and materials and in professional services report the largest increase in gen AI use.

Investments in gen AI and analytical AI are beginning to create value

The latest survey also shows how different industries are budgeting for gen AI. Responses suggest that, in many industries, organizations are about equally as likely to be investing more than 5 percent of their digital budgets in gen AI as they are in nongenerative, analytical-AI solutions (Exhibit 5). Yet in most industries, larger shares of respondents report that their organizations spend more than 20 percent on analytical AI than on gen AI. Looking ahead, most respondents—67 percent—expect their organizations to invest more in AI over the next three years.

Where are those investments paying off? For the first time, our latest survey explored the value created by gen AI use by business function. The function in which the largest share of respondents report seeing cost decreases is human resources. Respondents most commonly report meaningful revenue increases (of more than 5 percent) in supply chain and inventory management (Exhibit 6). For analytical AI, respondents most often report seeing cost benefits in service operations—in line with what we found last year—as well as meaningful revenue increases from AI use in marketing and sales.

Inaccuracy: The most recognized and experienced risk of gen AI use

As businesses begin to see the benefits of gen AI, they’re also recognizing the diverse risks associated with the technology. These can range from data management risks such as data privacy, bias, or intellectual property (IP) infringement to model management risks, which tend to focus on inaccurate output or lack of explainability. A third big risk category is security and incorrect use.

Respondents to the latest survey are more likely than they were last year to say their organizations consider inaccuracy and IP infringement to be relevant to their use of gen AI, and about half continue to view cybersecurity as a risk (Exhibit 7).

Conversely, respondents are less likely than they were last year to say their organizations consider workforce and labor displacement to be relevant risks, and they are not increasing efforts to mitigate them.

In fact, inaccuracy—which can affect use cases across the gen AI value chain, ranging from customer journeys and summarization to coding and creative content—is the only risk that respondents are significantly more likely than last year to say their organizations are actively working to mitigate.

Some organizations have already experienced negative consequences from the use of gen AI, with 44 percent of respondents saying their organizations have experienced at least one consequence (Exhibit 8). Respondents most often report inaccuracy as a risk that has affected their organizations, followed by cybersecurity and explainability.

Our previous research has found that there are several elements of governance that can help in scaling gen AI use responsibly, yet few respondents report having these risk-related practices in place (“Implementing generative AI with speed and safety,” McKinsey Quarterly, March 13, 2024). For example, just 18 percent say their organizations have an enterprise-wide council or board with the authority to make decisions involving responsible AI governance, and only one-third say gen AI risk awareness and risk mitigation controls are required skill sets for technical talent.

Bringing gen AI capabilities to bear

The latest survey also sought to understand how, and how quickly, organizations are deploying these new gen AI tools. We have found three archetypes for implementing gen AI solutions: takers use off-the-shelf, publicly available solutions; shapers customize those tools with proprietary data and systems; and makers develop their own foundation models from scratch (“Technology’s generational moment with generative AI: A CIO and CTO guide,” McKinsey, July 11, 2023). Across most industries, the survey results suggest that organizations are finding off-the-shelf offerings applicable to their business needs—though many are pursuing opportunities to customize models or even develop their own (Exhibit 9). About half of reported gen AI uses within respondents’ business functions are utilizing off-the-shelf, publicly available models or tools, with little or no customization. Respondents in energy and materials, technology, and media and telecommunications are more likely to report significant customization or tuning of publicly available models or developing their own proprietary models to address specific business needs.

Respondents most often report that their organizations required one to four months from the start of a project to put gen AI into production, though the time it takes varies by business function (Exhibit 10). It also depends upon the approach for acquiring those capabilities. Not surprisingly, reported uses of highly customized or proprietary models are 1.5 times more likely than off-the-shelf, publicly available models to take five months or more to implement.

Gen AI high performers are excelling despite facing challenges

Gen AI is a new technology, and organizations are still early in the journey of pursuing its opportunities and scaling it across functions. So it’s little surprise that only a small subset of respondents (46 out of 876) report that a meaningful share of their organizations’ EBIT can be attributed to their deployment of gen AI. Still, these gen AI leaders are worth examining closely. These, after all, are the early movers, who already attribute more than 10 percent of their organizations’ EBIT to their use of gen AI. Forty-two percent of these high performers say more than 20 percent of their EBIT is attributable to their use of nongenerative, analytical AI, and they span industries and regions—though most are at organizations with less than $1 billion in annual revenue. The AI-related practices at these organizations can offer guidance to those looking to create value from gen AI adoption at their own organizations.

To start, gen AI high performers are using gen AI in more business functions—an average of three functions, while others average two. They, like other organizations, are most likely to use gen AI in marketing and sales and product or service development, but they’re much more likely than others to use gen AI solutions in risk, legal, and compliance; in strategy and corporate finance; and in supply chain and inventory management. They’re more than three times as likely as others to be using gen AI in activities ranging from processing of accounting documents and risk assessment to R&D testing and pricing and promotions. While, overall, about half of reported gen AI applications within business functions are utilizing publicly available models or tools, gen AI high performers are less likely to use those off-the-shelf options than to either implement significantly customized versions of those tools or to develop their own proprietary foundation models.

What else are these high performers doing differently? For one thing, they are paying more attention to gen-AI-related risks. Perhaps because they are further along on their journeys, they are more likely than others to say their organizations have experienced every negative consequence from gen AI we asked about, from cybersecurity and personal privacy to explainability and IP infringement. Given that, they are more likely than others to report that their organizations consider those risks, as well as regulatory compliance, environmental impacts, and political stability, to be relevant to their gen AI use, and they say they take steps to mitigate more risks than others do.

Gen AI high performers are also much more likely to say their organizations follow a set of risk-related best practices (Exhibit 11). For example, they are nearly twice as likely as others to involve the legal function and embed risk reviews early on in the development of gen AI solutions—that is, to “shift left.” They’re also much more likely than others to employ a wide range of other best practices, from strategy-related practices to those related to scaling.

In addition to experiencing the risks of gen AI adoption, high performers have encountered other challenges that can serve as warnings to others (Exhibit 12). Seventy percent say they have experienced difficulties with data, including defining processes for data governance, developing the ability to quickly integrate data into AI models, and an insufficient amount of training data, highlighting the essential role that data play in capturing value. High performers are also more likely than others to report experiencing challenges with their operating models, such as implementing agile ways of working and effective sprint performance management.

About the research

The online survey was in the field from February 22 to March 5, 2024, and garnered responses from 1,363 participants representing the full range of regions, industries, company sizes, functional specialties, and tenures. Of those respondents, 981 said their organizations had adopted AI in at least one business function, and 878 said their organizations were regularly using gen AI in at least one function. To adjust for differences in response rates, the data are weighted by the contribution of each respondent’s nation to global GDP.
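
As a concrete illustration of the weighting step, the sketch below computes a GDP-weighted adoption rate from toy data; the country shares, respondents, and the one-weight-per-response simplification are assumptions, not the survey's actual weighting scheme.

```python
# Toy illustration of GDP-weighted aggregation: each response is weighted by its
# country's share of global GDP before computing an adoption rate. Shares,
# respondents, and the one-weight-per-response simplification are made up.

gdp_share = {"US": 0.26, "China": 0.17, "Germany": 0.04}
responses = [("US", True), ("US", False), ("China", True), ("Germany", True)]

weighted_yes = sum(gdp_share[country] for country, adopted in responses if adopted)
weighted_total = sum(gdp_share[country] for country, _ in responses)
print(f"GDP-weighted adoption rate: {weighted_yes / weighted_total:.1%}")  # ~64.4%
```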

Alex Singla and Alexander Sukharevsky are global coleaders of QuantumBlack, AI by McKinsey, and senior partners in McKinsey’s Chicago and London offices, respectively; Lareina Yee is a senior partner in the Bay Area office, where Michael Chui, a McKinsey Global Institute partner, is a partner; and Bryce Hall is an associate partner in the Washington, DC, office.

They wish to thank Kaitlin Noe, Larry Kanter, Mallika Jhamb, and Shinjini Srivastava for their contributions to this work.

This article was edited by Heather Hanselman, a senior editor in McKinsey’s Atlanta office.

An Analysis of Pandemic-Era Inflation in 11 Economies

In a collaborative project with ten central banks, we have investigated the causes of the post-pandemic global inflation, building on our earlier work for the United States. Globally, as in the United States, pandemic-era inflation was due primarily to supply disruptions and sharp increases in the prices of food and energy; however, and in sharp contrast to the 1970s, the inflationary effects of these supply shocks have not been persistent, in part due to the credibility of central bank inflation targets. As the effects of supply shocks have subsided, tight labor markets and rises in nominal wages have become relatively more important sources of inflation in many countries. In several countries, including the United States, curbing wage inflation and returning price inflation to target may require a period of modestly higher unemployment.

We thank the Peterson Institute for International Economics and the Hutchins Center for Fiscal and Monetary Policy at the Brookings Institution for research support. The views expressed herein are those of the authors and do not necessarily reflect the views of the National Bureau of Economic Research.

COMMENTS

  1. VLSI for Next Generation CE

    The current research in VLSI explores emerging trends and novel ideas and concepts covering a broad range of topics in the area of VLSI: from VLSI circuits, systems, and design methods, to system-level design and system-on-chip issues, to bringing VLSI methods to new areas and technologies such as nano and molecular devices, MEMS, and quantum computing. Future design methodologies are also key ...

  2. Current issues and emerging techniques for VLSI testing

    The development of complementary metal-oxide-semiconductor (CMOS) technology brought about a new paradigm for low-power circuit design. For the implementation of digital circuits with very large-scale integration, CMOS design styles are frequently employed in VLSI. There are billions of transistors on a single die in today's IC devices.

  3. Recent Trends in Novel Semiconductor Devices

    The VLSI industry has grown substantially over several decades. The packing density of integrated circuits has increased without compromising functionality. Scaling of semiconductor devices, improvements in process technology, and the development of new device designs are the key to this. Starting from planar MOSFETs to novel multigate transistors, semiconductor devices have a history of ...

  4. AI/ML algorithms and applications in VLSI design and technology

    This work thoroughly attempts to summarize the literature on AI/ML algorithms for VLSI design and modeling at different abstraction levels. It is the first paper that provides a detailed review encompassing circuit modeling to system-on-chip (SoC) design, along with physical design, testing, and manufacturing.

  5. 68784 PDFs

    Explore the latest full-text research PDFs, articles, conference papers, preprints and more on VLSI TECHNOLOGY. Find methods information, sources, references or conduct a literature review on VLSI ...

  6. Implementation of AI in the field of VLSI: A Review

    With this in mind, an extensive review has been conducted on various aspects of AI in the field of VLSI. This paper throws light on how AI has marked its way on various subfields of VLSI, namely, analog, digital and physical design. We have also taken into account the recent machine learning and deep learning techniques incorporated in VLSI.

  7. PDF In Proc. 2021 IEEE Symposium on VLSI Technology (in the press); https

    Co-packaged transceivers speed up. Integrated optical input-output technologies are promising for high-speed communications because of their scaling and bandwidth advantages compared with ...

  8. Progress of Placement Optimization for Accelerating VLSI Physical Design

    Placement is essential in very large-scale integration (VLSI) physical design, as it directly affects the design cycle. Despite extensive prior research on placement, achieving fast and efficient placement remains challenging because of the increasing design complexity. In this paper, we comprehensively review the progress of placement optimization from the perspective of accelerating VLSI ...

  9. Intel Labs Presents Research on New Power Efficiency Techniques at

    The 2021 Symposia on VLSI Technology and Circuits, a two-track conference including the VLSI Circuit Symposium and the VLSI Technology Symposium, run from June 13-19, 2021. Intel researchers will present their latest research on breakthroughs in power efficiencies enabled by new materials and circuits in silicon at the VLSI Circuits Symposium.

  10. Implementation of Machine Learning in VLSI Integrated ...

    Machine learning has made an impact on the microchip domain, where it was initially used for automation. These techniques will eventually supplant the current VLSI design concept. Design creation has been automated by substituting time-consuming traditional concepts developed by experts. This development could result in a tremendous change in the realm of hardware computation and AI ...

  11. VLSI-SoC: Design Trends

    The VLSI-SoC 2020 proceedings present cutting-edge research on very large scale integration, low-power design of RF, and more. VLSI-SoC: Design Trends: 28th IFIP WG 10.5/IEEE International Conference on Very Large Scale Integration, VLSI-SoC 2020, Salt Lake City, UT, USA, October 6-9, 2020, Revised and Extended Selected Papers | SpringerLink

  12. AI/ML Algorithms and Applications in VLSI Design and Technology

    D. Amuru et al., “AI/ML Algorithms and Applications in VLSI Design and Technology,” preprint submitted to Elsevier. Authors: Deepthi Amuru, Harsha V. Vudumula, Pavan K. Cherupally, Sushanth R. Gurram, Amir Ahmad, Andleeb Zahra, and Zia Abbas; Center for VLSI and Embedded Systems Technology (CVEST), International Institute of Information Technology, Hyderabad (IIIT-H), ...

  13. Intel Debuts Intel 4 Technologies Among 13 Papers at the 2022 VLSI

    The 2022 IEEE Symposium on VLSI Technology and Circuits will run from June 13-17th in Honolulu, HI, and offer limited access to conference content on-demand. Researchers present 13 papers, including results of a new advanced CMOS FinFET technology, Intel 4, demonstrating more than 20% performance gain at iso-power over Intel 7.

  14. Electronics

    A Feature Paper should be a substantial original Article that involves several techniques or approaches, provides an outlook for future research directions and describes possible research applications. Feature papers are submitted upon individual invitation or recommendation by the scientific editors and must receive positive feedback from the ...

  15. Electronics

    The focus of this Special Issue is on the research challenges related to the design of emerging microelectronics and VLSI circuits and systems that meet the demanding specifications of innovative applications. This Special Issue considers challenges in the fields of low power consumption, small integration area, testing and security, without ...

  16. vlsi testing Latest Research Papers

    VLSI Testing is one of the essential domains in recent times. With the channel length of the transistor decreasing continually, the number of transistors in a chip increases, thus increasing the probability of defects or faults. Automatic Test Pattern Generator is one way to find such input test vectors to the circuit, which will help identify ...

  17. High-Performance VLSI Architectures for Artificial Intelligence and

    research, explains its goals, and emphasizes the importance of solving them to advance the state-of-the-art VLSI architectures for AI and ML applications. Despite the growth of research activities in this area, there are still significant gaps in the VLSI architectures for AI and ML applications. First, current VLSI architectures

  18. A Comprehensive Review of Machine Learning Applications in VLSI Testing

    The semiconductor industry's relentless pursuit of increased device integration and performance has propelled Very Large-Scale Integration (VLSI) technology into an era of unparalleled complexity. Ensuring the reliability and quality of VLSI devices demands innovative solutions, and machine learning has emerged as a transformative force in this domain. This comprehensive review paper delves ...

  19. Latest Research topics in vlsi design

    Latest research topics in VLSI design - A Doctor of Philosophy is the final degree in any area. It requires a lot of effort and hard work to achieve. It starts with the selection of a topic, which should be recent and lie in your area of interest. If we talk specifically about research in technology, then ...

  20. Browse journals and books

    Browse Calls for Papers. Browse 5,060 journals and 35,600 books. ...

  21. ASU hosts 2024 IEEE VLSI Test Symposium

    Testing reception to new VLSI test ideas. Attendees from industry, government institutions and universities from around the world network with other microelectronics testing experts, showcase their discoveries in research papers published as part of the event proceedings, listen to keynote speeches and exchange ideas in various aspects of testing.

  22. Ultrasound offers a new way to perform deep brain stimulation

    The new approach uses ultrasound delivered by a fiber about the thickness of a human hair. ... MIT graduate student Jason Hou and MIT postdoc Md Osman Goni Nayeem are the lead authors of the paper, along with collaborators from MIT's McGovern Institute for Brain Research, Boston University, and Caltech.

  23. 2024 Environmental Performance Index: A Surprise Top Ranking, Global

    The Baltic nation of Estonia is No. 1 in the 2024 rankings, while Denmark, one of the top ranked countries in the 2022 EPI, dropped to 10th place, highlighting the challenges of reducing emissions in hard-to-decarbonize industries. Meanwhile, "paper parks" are proving a global challenge to international biodiversity commitments.

  24. Working Paper 24-03: The Lock-In Effect of Rising Mortgage Rates

    This paper finds that for every percentage point that market mortgage rates exceed the origination interest rate, the probability of sale is decreased by 18.1%. This mortgage rate lock-in led to a 57% reduction in home sales with fixed-rate mortgages in 2023Q4 and prevented 1.33 million sales between 2022Q2 and 2023Q4.

  25. OpenAI Offers a Peek Inside the Guts of ChatGPT

    Today, OpenAI released a new research paper apparently aimed at showing it is serious about tackling AI risk by making its models more explainable. In the paper, researchers from the company lay ...

  26. Advanced CMOS VLSI Technology for Low Power Analog System Design with

    This research article provides insight into the important challenges involved in low-power analog system design using an advanced CMOS VLSI approach. Reduction in the dimension of the MOS base channel and reduction in gate oxide result in greater advancement in terms of chip area, operating speed, and reduction of power consumption (mainly in digital components). In other words, few ...

  27. New Paper Redefining Characteristics of Lightning-Initiated ...

    Twenty-six years of lightning data were paired with over 68,000 LIW reports to understand the lightning flash characteristics responsible for ignition between 1995 and 2020. Results indicate that 92% of LIW were started by negative cloud-to-ground (CG) lightning flashes and 57% were single-stroke flashes. Moreover, 62% of LIW reports did not ...

  28. The state of AI in early 2024: Gen AI adoption spikes and starts to

    If 2023 was the year the world discovered generative AI (gen AI), 2024 is the year organizations truly began using—and deriving business value from—this new technology.In the latest McKinsey Global Survey on AI, 65 percent of respondents report that their organizations are regularly using gen AI, nearly double the percentage from our previous survey just ten months ago.

  29. An Analysis of Pandemic-Era Inflation in 11 Economies

    Issue Date May 2024. In a collaborative project with ten central banks, we have investigated the causes of the post-pandemic global inflation, building on our earlier work for the United States. Globally, as in the United States, pandemic-era inflation was due primarily to supply disruptions and sharp increases in the prices of food and energy ...