
GPT-4.5: The Next Evolution in AI

Last Updated on March 4, 2025 by Editorial Team

Author(s): Naveen Krishnan

Originally published on Towards AI.

Last week, I shared my thoughts on phi‑4 models and their innovative multimodal approach. Today, I’m thrilled to write about GPT‑4.5 — a model that not only pushes the boundaries of conversational AI but also makes it easier for developers to integrate powerful language capabilities into their apps via Azure OpenAI and Foundry. Grab your favorite beverage ☕, settle in, and let’s explore how GPT‑4.5 is set to transform our interactions with technology!

Image Source: OpenAI News | OpenAI

From GPT‑4 to GPT‑4.5: A Quick Evolutionary Recap 🔍

GPT‑4 paved the way for richer, more nuanced conversations. With GPT‑4.5, OpenAI has fine‑tuned the art of understanding context and generating responses that are even more human-like. Improvements in efficiency, contextual awareness, and multimodal integration mean that whether you’re building chatbots, content generators, or analytical tools, GPT‑4.5 can handle your toughest challenges.

But the real magic happens when you combine GPT‑4.5 with the robust enterprise-grade capabilities of Azure OpenAI Service — and then manage everything seamlessly using Azure AI Foundry. The result? A platform that’s both flexible and scalable for modern app development. ✨

Key Features of GPT‑4.5 💡

  • Enhanced Conversational Depth: GPT‑4.5 can maintain context over longer conversations, delivering responses that feel more intuitive and relevant.
  • Improved Accuracy & Efficiency: Faster processing means you get your answers almost in real time without sacrificing quality.
  • Humanized Output: With its refined tone and style, GPT‑4.5’s responses feel less mechanical and more like chatting with an insightful friend.
  • Seamless Multimodal Integration: Whether you’re feeding text, images, or data from various sources, GPT‑4.5 adapts and responds with finesse.
  • Enterprise‑Grade Integration: Through Azure OpenAI and Foundry, GPT‑4.5 becomes a part of a secure, scalable, and fully managed ecosystem ideal for production environments.

Why Azure OpenAI with Foundry? 🔗

Integrating GPT‑4.5 via Azure OpenAI Service offers several advantages:

  • Security & Compliance: Azure ensures your data is handled in compliance with industry standards (GDPR, HIPAA, etc.).
  • Scalability: Whether you’re a startup or an enterprise, Azure’s infrastructure scales with your needs.
  • Unified Management: Azure AI Foundry simplifies the management of models, data sources, and endpoints.
  • Easy Integration: With robust SDKs and clear sample code, you can quickly incorporate GPT‑4.5 into your applications.

In the sections below, I’ll walk you through sample code that demonstrates how to invoke GPT‑4.5 using Azure OpenAI and Foundry — across multiple languages so you can pick the one that fits your project best. Let’s get coding! 🚀

Invoking GPT‑4.5 via Azure OpenAI Using Foundry

Setting the Stage: Environment & Authentication

Before diving into the code, ensure you have the following prerequisites:

  • An Azure OpenAI Service resource with GPT‑4.5 available in your subscription.
  • Access to Azure AI Foundry, which helps manage and connect your models.
  • Appropriate credentials (API keys or managed identities) stored securely (e.g., in environment variables or Azure Key Vault).

Below, you’ll find sample code in C# (.NET) and Python. These examples assume you have set environment variables like AZURE_OPENAI_ENDPOINT, AZURE_OPENAI_API_KEY, and AZURE_OPENAI_DEPLOYMENT_NAME. Adjust these as needed!
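If you want to fail fast when one of these variables is missing, a quick check like the following (shown in Python, with the same variable names assumed) can save you from a confusing authentication error later. This is just a convenience sketch, not part of any SDK:

import os

REQUIRED_VARS = [
    "AZURE_OPENAI_ENDPOINT",
    "AZURE_OPENAI_API_KEY",
    "AZURE_OPENAI_DEPLOYMENT_NAME",
]

# Fail fast with a clear message instead of a cryptic error at request time
missing = [name for name in REQUIRED_VARS if not os.getenv(name)]
if missing:
    raise EnvironmentError(f"Missing required environment variables: {', '.join(missing)}")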

Sample Code in C# (.NET)

Below is a sample console application written in C# that initializes the Azure OpenAI client, configures the Foundry connection, and sends a request to GPT‑4.5.

using System;
using Azure;
using Azure.AI.OpenAI;
using System.Collections.Generic;

namespace GPT45Demo
{
    class Program
    {
        static void Main(string[] args)
        {
            // Load configuration from environment variables
            string endpoint = Environment.GetEnvironmentVariable("AZURE_OPENAI_ENDPOINT");
            string apiKey = Environment.GetEnvironmentVariable("AZURE_OPENAI_API_KEY");
            string deploymentName = Environment.GetEnvironmentVariable("AZURE_OPENAI_DEPLOYMENT_NAME");

            // Initialize the Azure OpenAI client using Foundry integration settings
            OpenAIClient client = new OpenAIClient(new Uri(endpoint), new AzureKeyCredential(apiKey));

            // Create a system prompt to guide GPT-4.5's responses
            string systemPrompt = "You are a knowledgeable assistant with deep insights on a range of topics. Please respond in a friendly and engaging manner, using emojis where appropriate. 😊";

            // Build conversation history – you could extend this to include previous interactions
            List<ChatMessage> messages = new List<ChatMessage>
            {
                new ChatMessage(ChatRole.System, systemPrompt),
                new ChatMessage(ChatRole.User, "Can you show me how to invoke GPT-4.5 using Azure OpenAI with Foundry integration?")
            };

            // Create chat completion options
            ChatCompletionsOptions options = new ChatCompletionsOptions
            {
                MaxTokens = 500,
                Temperature = 0.7f,
                // Setting the deployment name from environment variables ensures we are using our GPT-4.5 model
                DeploymentName = deploymentName
            };

            // Add our conversation messages
            foreach (var msg in messages)
            {
                options.Messages.Add(msg);
            }

            // Send the request and receive the response
            ChatCompletions response = client.GetChatCompletions(options);

            // Print the first completion result
            Console.WriteLine("Response from GPT-4.5:");
            Console.WriteLine(response.Choices[0].Message.Content);
        }
    }
}

Explanation:

  • We begin by loading our endpoint, API key, and deployment name from environment variables for secure configuration.
  • A system prompt is defined to ensure GPT‑4.5 understands the tone and style expected.
  • The conversation is built as a list of messages (system + user), which is then sent using the Azure OpenAI client.
  • Finally, we print out the response — this is the core of our Foundry integration, which helps manage model settings and authentication.

Sample Code in Python

Here’s a Python example using the OpenAI library (configured for Azure OpenAI) to invoke GPT‑4.5 with Foundry integration.

import os
import openai

# Load environment variables (ensure these are set securely)
azure_endpoint = os.getenv("AZURE_OPENAI_ENDPOINT")
azure_api_key = os.getenv("AZURE_OPENAI_API_KEY")
deployment_name = os.getenv("AZURE_OPENAI_DEPLOYMENT_NAME")

# Configure the OpenAI client (pre-1.0 SDK) to use Azure OpenAI
openai.api_type = "azure"
openai.api_base = azure_endpoint
openai.api_key = azure_api_key
openai.api_version = "2024-10-21"  # Adjust API version as needed

# Define a system prompt for context
system_message = (
    "You are a friendly and insightful assistant. Please provide detailed and engaging responses, "
    "using emojis and human-like language when appropriate. 😊"
)

# Prepare the conversation messages
messages = [
    {"role": "system", "content": system_message},
    {"role": "user", "content": "Show me an example of invoking GPT-4.5 via Azure OpenAI with Foundry integration."}
]

# Create a chat completion request
response = openai.ChatCompletion.create(
    engine=deployment_name,  # This corresponds to the GPT-4.5 deployment in your Azure resource
    messages=messages,
    max_tokens=500,
    temperature=0.7
)

# Print the generated response
print("Response from GPT-4.5:")
print(response.choices[0].message.content)

Explanation:

  • Environment variables are loaded using os.getenv() for secure configuration.
  • The openai module is configured to point to your Azure endpoint and use your API key.
  • We construct a conversation with both a system prompt and a user prompt.
  • The ChatCompletion.create() method sends our request to the GPT‑4.5 model deployed via Azure OpenAI (managed by Foundry).
  • Finally, we print the response. This code is ideal for rapid prototyping or integration within larger Python-based applications.

Integrating with Azure AI Foundry: Best Practices & Tips 🔧💡

1. Secure Your Keys:
Always store your API keys and sensitive configuration data using environment variables or secure vaults (like Azure Key Vault). Avoid hard‑coding secrets in your source code. 🔐
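For instance, here is a minimal sketch of loading the API key from Azure Key Vault at startup using the azure-identity and azure-keyvault-secrets packages; the vault URL and secret name below are placeholders for your own resources.

from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

# Hypothetical vault URL and secret name (replace with your own)
VAULT_URL = "https://my-keyvault.vault.azure.net"
SECRET_NAME = "AZURE-OPENAI-API-KEY"

# DefaultAzureCredential works with managed identities, the Azure CLI, and more
credential = DefaultAzureCredential()
secret_client = SecretClient(vault_url=VAULT_URL, credential=credential)

# Retrieve the key at startup instead of embedding it in source code
api_key = secret_client.get_secret(SECRET_NAME).value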

2. Manage Conversation History:
For a richer dialogue, store past conversation turns (system, user, assistant) and pass them in your request. This context allows GPT‑4.5 to generate responses that consider previous interactions. However, be mindful of token limits! 📜
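One simple (if rough) way to respect token limits is to keep the system prompt plus only the most recent turns. The helper below is a sketch of that idea; a production version would count actual tokens with a tokenizer rather than counting messages.

history = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hi!"},
    {"role": "assistant", "content": "Hello! How can I help? 😊"},
]

def trim_history(messages, max_turns=10):
    """Keep the system message(s) plus the most recent user/assistant turns."""
    system_messages = [m for m in messages if m["role"] == "system"]
    other_messages = [m for m in messages if m["role"] != "system"]
    return system_messages + other_messages[-max_turns:]

# Trim before each request, then append the new user message
history = trim_history(history)
history.append({"role": "user", "content": "And what about token limits?"})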

3. Customize Your Prompts:
Experiment with the system prompt to adjust tone and response style. GPT‑4.5’s output can be tailored to different audiences — whether formal, casual, or fun. Emojis, as you’ve seen, add that extra human touch. 😄

4. Monitor & Optimize:
Use telemetry and logging (via Azure Monitor or Application Insights) to track response times, errors, and user interactions. This helps fine‑tune both your prompts and integration code. 📊
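If you want a lightweight starting point before wiring up Application Insights, a wrapper like the one below logs latency, token usage, and failures using only the standard library; the helper name is mine, not part of any SDK.

import logging
import time

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("gpt45")

def timed_chat_completion(create_fn, **kwargs):
    """Wrap a chat-completion call and log latency, token usage, and errors."""
    start = time.perf_counter()
    try:
        response = create_fn(**kwargs)
    except Exception:
        logger.exception("Chat completion failed after %.2fs", time.perf_counter() - start)
        raise
    elapsed = time.perf_counter() - start
    usage = getattr(response, "usage", None)
    logger.info("Chat completion took %.2fs (usage: %s)", elapsed, usage)
    return response

You could then call, for example, timed_chat_completion(openai.ChatCompletion.create, engine=deployment_name, messages=messages, max_tokens=500) and later point the same logs at Azure Monitor.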

5. Leverage Foundry’s Ecosystem:
Azure AI Foundry not only simplifies model deployment and connection management but also allows you to integrate additional data sources (like Azure Cognitive Search) to augment GPT‑4.5’s responses. This can be especially powerful for creating context‑aware, retrieval‑augmented generation (RAG) pipelines. 🔄
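As a rough illustration of that RAG pattern, the sketch below queries an Azure Cognitive Search index (via the azure-search-documents package) and injects the retrieved passages into the prompt. The environment variable names, index name, and "content" field are hypothetical placeholders for your own resources.

import os
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient

# Hypothetical search resource and index (replace with your own)
search_client = SearchClient(
    endpoint=os.getenv("AZURE_SEARCH_ENDPOINT"),
    index_name="product-docs",
    credential=AzureKeyCredential(os.getenv("AZURE_SEARCH_API_KEY")),
)

def build_rag_messages(question, top=3):
    """Retrieve supporting passages and prepend them to the prompt."""
    results = search_client.search(question, top=top)
    context = "\n\n".join(doc["content"] for doc in results)  # assumes a 'content' field in the index
    return [
        {"role": "system", "content": "Answer using only the provided context."},
        {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
    ]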

Deep Dive: Invoking GPT‑4.5 in a Production Environment

When deploying GPT‑4.5 in production, consider the following additional points:

  • Token Management:
    Always monitor token usage to avoid unexpected costs and performance bottlenecks. Limit the conversation history to the most relevant messages.
  • Error Handling:
    Implement robust error handling for timeouts, API errors, and connectivity issues. The .NET and Python samples above show only the happy path, so wrap the calls with retries and timeouts before going to production (see the retry sketch after this list).
  • Scalability:
    With Azure’s scalable infrastructure, you can handle high volumes of requests. Integrate auto‑scaling and load balancing to maintain performance as demand grows.
  • Customization:
    Use Foundry’s configuration capabilities to customize deployment parameters, API versions, and even UI elements if you’re building a web app interface on top of GPT‑4.5.
  • Feedback & Iteration:
    Collect user feedback (via built‑in UI elements or logging) to iterate on prompts and system settings. This continuous improvement loop is essential for maintaining a high‑quality user experience.
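To make the error-handling point concrete, here is a minimal retry-with-backoff sketch for the Python path. It assumes the pre-1.0 openai SDK used in the sample above; the function name and retry counts are illustrative only.

import time
import openai

def chat_with_retries(max_attempts=3, **kwargs):
    """Retry transient failures (rate limits, timeouts, connection errors) with exponential backoff."""
    for attempt in range(1, max_attempts + 1):
        try:
            return openai.ChatCompletion.create(**kwargs)
        except (openai.error.RateLimitError, openai.error.Timeout, openai.error.APIConnectionError) as exc:
            if attempt == max_attempts:
                raise
            delay = 2 ** attempt  # 2s, 4s, 8s ...
            print(f"Transient error ({exc}); retrying in {delay}s (attempt {attempt}/{max_attempts})")
            time.sleep(delay)

You would call it exactly like ChatCompletion.create, for example chat_with_retries(engine=deployment_name, messages=messages, max_tokens=500).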

Conclusion: Embrace the Future with GPT‑4.5 🌟

GPT‑4.5 represents a significant leap forward in the realm of conversational AI — merging technical excellence with a more natural, humanized interaction style. With seamless integration into the Azure OpenAI ecosystem and the powerful management capabilities of Azure AI Foundry, developers can now build applications that are not only more intelligent but also easier to manage and scale.

Whether you’re working in .NET, Python, or another language, the sample code above should serve as a helpful starting point. Experiment with different prompts, tweak your system messages, and harness the power of GPT‑4.5 to transform your applications.

I hope you found this deep dive both informative and inspiring. Drop your thoughts, questions, or feedback in the comments below, and let’s continue the conversation! 🚀💬

Happy coding and stay curious! 😄👍

Thank You!

Thanks for taking the time to read my story! If you enjoyed it and found it valuable, please consider giving it a clap (or 50!) to show your support. Your claps help others discover this content and motivate me to keep creating more.

Also, don’t forget to follow me for more insights and updates on AI. Your support means a lot and helps me continue sharing valuable content with you. Thank you!


Published via Towards AI
