Open Access | 31 March 2024
Photonics for Energy in the Age of AI
Abstract

JPE Editor-in-Chief Sean Shaheen remarks on the growth of generative AI and its potential for advancing scientific knowledge and discovery.

The growth in generative artificial intelligence (AI), particularly large language models (LLMs), in the last several years has been unparalleled, with numerous LLM systems going through step-change increases in capabilities and numbers of users worldwide. The pace of progress appears to be only accelerating, with LLMs such as OpenAI’s ChatGPT, Google’s Gemini, Meta’s Llama, Anthropic’s Claude, Mistral’s Large, and xAI’s Grok, to name a few, competing for top scores in a variety of benchmarks.1,2 Fundamental research in deep learning, with origins in parallel distributed processing, has been ongoing for decades,3,4 but the recent advances in LLM capabilities have seemingly crossed a threshold such that AI can now potentially help us in nearly every aspect of our work and, perhaps soon, our daily lives. We now find ourselves in new terrain as we try to understand, explore, and benefit from the realm of possible use cases of AI.

As one example from an academic standpoint, there are immediate implications of AI for teaching, as AI could be used as a “force multiplier” for instructors who use it judiciously and follow evidence-based practices.5 More broadly, AI is emerging as a form of co-intelligence that can function as a thinking companion to augment our cognitive capabilities and productivity.6 At minimum, even a brief conversation with a generative AI about a topic can often function as an “intuition pump”7 — one that is instantly available and never tires. It is hard to fully conceive at present what paradigm shifts will arise in our ways of thinking, working, and interacting with each other, but it is a good guess that they will be significant, if not transformational.

In the realm of scientific knowledge and discovery, early evidence suggests that LLMs can demonstrate expert-level competence in conceptual physics reasoning,8 and we can expect continual improvements in the future with new releases of existing LLMs and, one can speculate, entirely new ones tailored toward scientific understanding and discovery. AI is already accelerating scientific research by helping to simulate complex concepts, design experiments, and make predictions about their outcomes.9 Researchers are now combining multiple LLM agents to design and execute scientific experiments, for instance, in chemical synthesis.10,11 Recent work has shown progress in predicting the physical properties of materials from their crystal structures, using more efficient neural network architectures and algorithms than previously available.12

At JPE we invite you to use AI to supercharge your work and to accelerate advances in our field, while following the ethical guidelines provided in our author guidelines.13 Many use cases of LLMs and generative AI can be envisioned, for instance, in the design of materials and device structures, in the acceleration and deepening of data analysis, and as the aforementioned thinking companion to better interpret results and, perhaps, to help conceive of and design the experiment in the first place. Studies of new computing paradigms, for instance in probabilistic computing for combinatorial optimization,14 which provide pathways to energy-efficient hardware implementations of neuronal computing, are also strongly encouraged.
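To give a flavor of the probabilistic-computing paradigm mentioned above, the following Python snippet simulates a toy network of “p-bits” — probabilistic bits that fluctuate between ±1 with a probability set by a sigmoid of their local field — annealed to solve a small MAX-CUT instance. This is purely an illustrative sketch, not the method of ref. 14: the graph, couplings, and annealing schedule are all invented for demonstration.

```python
import math
import random

# Toy p-bit network: couplings J encode a 4-node ring graph with
# antiferromagnetic (negative) couplings, so the Ising ground state
# alternates spins and cuts all four ring edges (MAX-CUT = 4).
random.seed(0)
J = {(0, 1): -1.0, (1, 2): -1.0, (2, 3): -1.0, (3, 0): -1.0}

def neighbors(i):
    """Yield (neighbor, coupling) pairs for node i."""
    for (a, b), w in J.items():
        if a == i:
            yield b, w
        elif b == i:
            yield a, w

def energy(s):
    """Ising energy E = -sum_ij J_ij s_i s_j over the coupled pairs."""
    return -sum(w * s[a] * s[b] for (a, b), w in J.items())

s = [random.choice([-1, 1]) for _ in range(4)]
best = list(s)
beta = 0.3                                  # inverse temperature
for step in range(5000):
    beta = min(3.0, beta * 1.002)           # slow annealing toward cold
    i = random.randrange(4)
    field = sum(w * s[j] for j, w in neighbors(i))
    # p-bit update: P(s_i = +1) is a sigmoid of the local field.
    p_up = 1.0 / (1.0 + math.exp(-2.0 * beta * field))
    s[i] = 1 if random.random() < p_up else -1
    if energy(s) < energy(best):
        best = list(s)

cut = sum(1 for (a, b) in J if best[a] != best[b])
print("best state:", best, "cut edges:", cut)
```

Each stochastic single-bit update is the kind of operation that proposed p-bit hardware performs natively and in parallel, which is where the energy-efficiency argument for such architectures comes from.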

For the record, no AI was used in the generation of this editorial, and any errors in content or delivery are strictly due to human error of the author!

References

1. Vellum, “LLM Leaderboard,” https://www.vellum.ai/llm-leaderboard

2. Hugging Face, Inc., “LLM Directory,” https://llm.extractum.io

3. Y. Cao et al., “A comprehensive survey of AI-Generated Content (AIGC): a history of generative AI from GAN to ChatGPT,” (2023).

4. M. R. Douglas, “Large language models,” (2023).

5. E. R. Mollick and L. Mollick, “Using AI to implement effective teaching strategies in classrooms: five strategies, including prompts,” https://ssrn.com/abstract=4391243

6. E. R. Mollick, Co-Intelligence: Living and Working with AI, Penguin Random House (2024).

7. D. C. Dennett, Intuition Pumps and Other Tools for Thinking, W. W. Norton & Company (2014).

8. C. G. West, “Advances in apparent conceptual physics reasoning in GPT-4,” (2023).

10. D. A. Boiko et al., “Emergent autonomous scientific research capabilities of large language models,” (2023).

11. A. M. Bran et al., “ChemCrow: augmenting large-language models with chemistry tools,” (2023).

12. T. Taniai et al., “Crystalformer: infinitely connected attention for periodic structure encoding,” (2024).

14. J. Zhu, Z. Xie, and P. Bermel, “Numerical simulation of probabilistic computing to NP-complete number theory problems,” J. Photonics Energy 13(2), 028501 (2023). https://doi.org/10.1117/1.JPE.13.028501
© 2024 Society of Photo-Optical Instrumentation Engineers (SPIE)
Sean Shaheen "Photonics for Energy in the Age of AI," Journal of Photonics for Energy 14(1), 010101 (31 March 2024). https://doi.org/10.1117/1.JPE.14.010101
KEYWORDS: Artificial intelligence, Design, Photonics, Materials properties, Neural networks, Parallel computing