We’ve felt a nice surge of energy over the past month, as many of our authors have transitioned from summer mode to fall, with a renewed emphasis on learning, experimenting, and launching new projects.
We published many more great articles in September than we could ever feature here, but we still wanted to make sure you didn’t miss some recent standouts. Below are ten articles that resonated strongly with our community, whether through the large number of readers they attracted, the lively conversations they inspired, or the timely topics they addressed. We are sure you will enjoy exploring them.
- New ChatGPT prompt engineering technique: program simulation
It’s pretty rare for an author’s TDS debut to become one of the most popular articles of the month, but Giuseppe Scalamogna’s article achieved this feat with an accessible and timely explainer on program simulation: a prompt engineering technique that “aims to make ChatGPT work in a way that simulates a program” and can lead to impressive results.
- How to program a neural network
Neural network tutorials are easy to find. Less common? A step-by-step guide that helps readers gain an intuitive understanding of how they work, and the practical know-how to code them from scratch. That is precisely what Callum Bruce delivered in his latest contribution.
- Don’t Start Your Data Science Journey Without These 5 Must-Have Steps – Complete Spotify Data Scientist Guide
If you have already discovered Khouloud El Alami’s writing, you won’t be surprised to learn that this most recent article offers actionable insights presented in an accessible and engaging manner. This one is aimed at data scientists at the start of their careers: if you don’t know how to get on the right track, Khouloud’s advice will help you find your way.
- How to Design a Roadmap for a Machine Learning Project
For those of you who are already well into your ML journey, Heather Couture’s new article offers a useful framework for streamlining the design of your next project. From in-depth document analysis to post-deployment maintenance, it covers all the bases of a successful iterative workflow.
- The public perception problem with machine learning
In a thought-provoking reflection, Stephanie Kirmer addresses a fundamental tension in current debates around AI: “all of our work in the service of building more and more advanced machine learning is limited in its possibilities not by the number of GPUs we can get our hands on but by our ability to explain what we build and educate the public on what it means and how to use it.”
- How to create an LLM from scratch
Drawing inspiration from the development process of models like GPT-3 and Falcon, Shawhin Talebi reviews the key aspects of creating a foundation LLM. Even if you don’t plan to train the next LLaMA any time soon, it’s helpful to understand the practical considerations that come into play in such a large undertaking.
- Your own personal ChatGPT
If you are in the mood to create and tinker with language models, however, a good place to start is Robert A. Gonsalves’s detailed overview of what it takes to fine-tune OpenAI’s GPT-3.5 Turbo model to perform new tasks using your own custom data.
- How to build a multi-GPU system for Deep Learning in 2023
Don’t roll up your sleeves yet: one of our most-read tutorials in September, by Antonis Makropoulos, focuses on deep learning hardware and infrastructure and takes us through the nitty-gritty details of choosing the right components for your project needs.
- Meta-heuristics explained: ant colony optimization
For a more theoretical but no less fascinating subject, Hennie de Harder’s introduction to ant colony optimization draws our attention to a “lesser-known gem” of an algorithm, explores how it was inspired by the ingenious foraging behaviors of ants, and reveals its inner workings. (In a follow-up article, Hennie also demonstrates how it can solve real-world problems.)
- Falcon 180B: Can it run on your computer?
Ending on an ambitious note, Benjamin Marie aims to find out if one can run the (very, very large) Falcon 180B model on consumer hardware. (Spoiler alert: yes, with a few caveats.) This is a valuable resource for anyone weighing the pros and cons of working on a local machine versus using cloud services, especially now that more and more open-source LLMs are coming onto the scene.
Our latest cohort of new authors
Every month, we’re excited to see a new group of authors join TDS, each sharing their own voice, knowledge, and experience with our community. If you’re looking for new writers to explore and follow, simply browse the work of our latest additions, including Rahul Nayak, Christian Burke, Aïcha Bokbot, Jason Vega, Giuseppe Scalamogna, Masatake Hirono, Shachaf Poran, Aris Tsakpinis, Nicolas Granieri, Lazare Kolebka, Ninad Sohoni, Mina Ghashami, Carl Bettosi, Dominika Woszczyk, James Koh, Ph.D., Tom Corbin, Antonio Jiménez Caballero, Gijs van den Dool, Ramkumar K., Milan Janosov, Luc Zaruba, Sohrab Sani, James Hamilton, Ilija Lazarević, Josh Poduska, Antonis Makropoulos, Yuichi Inoue, George Stavrakis, Yunzhe Wang, Anjan Biswas, Jared M. Maruskin, PhD, Michael Roizner, Alana Rister, Ph.D., Damien Gil, Shafquat Arefeen, Dmitry Kazhdan, Ryan Pegoud, and Robert Martin-Short.
Thank you for supporting the work of our authors! If you enjoy the articles you read on TDS, consider becoming a Medium member — this unlocks our entire archive (and every other article on Medium as well).
Until the next Variable,
Prompt engineering tips, a how-to guide to neural networks, and other recent must-reads was originally published in Towards Data Science on Medium, where people are continuing the conversation by highlighting and responding to this story.