Building A Robust and Efficient AWS Cloud Infrastructure with Terraform and GitLab CI/CD.

Last Updated on October 5, 2024 by Editorial Team

Author(s): Julius Nyerere Nyambok

Originally published on Towards AI.

Historically, cloud infrastructure management involved manual configuration through web consoles or command-line interfaces. This approach was prone to human error, produced inconsistent configurations, and made version control difficult. The growing complexity of cloud environments and the demand for faster, more reliable, and reproducible infrastructure management highlighted the need for a more efficient solution.
Infrastructure-as-code (IaC) is a DevOps practice that uses code to define and deploy infrastructure. Terraform by HashiCorp is an IaC tool that allows you to define and provision cloud resources using a declarative language called HashiCorp Configuration Language (HCL).
In this article, we will deploy resources on AWS through Terraform and create a CI/CD pipeline on GitLab to automate the deployment process.

Figure 1: Terraform basic flowchart

Part I: Introduction

In this project, we will define the AWS infrastructure, write Terraform code that describes it, build the infrastructure, and automate its creation with GitLab CI/CD pipelines, so that whenever a change is made, the pipeline runs the Terraform commands and updates the infrastructure. You require the following tools for this project:

  1. AWS account and a user account — our preferred cloud provider, which offers a free tier.
  2. AWS CLI — a command-line interface to authenticate our AWS credentials.
  3. Terraform — the infrastructure-as-code tool we use to deploy cloud resources via code. You can follow this tutorial to install it.
  4. GitLab account — to store our code in a repository and create our CI/CD pipeline.
  5. Any code editor you prefer, e.g., VS Code.

Here is the link to the GitLab repository, which I have successfully mirrored on my GitHub.

GitHub – Jnyambok/Terraform-CI-CD-Pipeline: AWS infrastructure consisting of a VPC and an Amazon EC2 instance, deployed through Terraform (github.com).

Part II: Infrastructure Definition

A Virtual Private Cloud (VPC) is a private, isolated section of the AWS cloud where you can launch resources. It’s akin to a private data center within the public cloud that allows you to customize the configuration, including subnets, routing tables, and security groups.
An Elastic Compute Cloud (EC2) instance is a virtual server in the cloud that provides on-demand computing capacity and resources like CPU, memory, and storage.
A security group is a firewall configuration for your services that defines what ports on the machine are open to incoming traffic.

Figure 2: Our basic AWS infrastructure

Imagine you want to create an application on AWS. You would first create a VPC to provide a private network for your web application. Then, you would launch EC2 instances within the VPC to run your application. The VPC, through a security group, would define the network configurations for the EC2 instances to ensure they communicate with each other and the outside world. This infrastructure is what we will build.
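The address ranges involved can be sanity-checked with Python's standard `ipaddress` module. This is a quick illustrative sketch, not part of the Terraform workflow; the CIDR values are the ones used in the Terraform configuration later in this article:

```python
import ipaddress

# CIDR ranges from the Terraform configuration in this article
vpc_cidr = ipaddress.ip_network("10.0.0.0/16")     # the VPC's address space
subnet_cidr = ipaddress.ip_network("10.0.1.0/24")  # one subnet carved out of it

# A subnet is only valid if its range lies entirely inside the VPC's range
print(subnet_cidr.subnet_of(vpc_cidr))             # True
print(vpc_cidr.num_addresses)                      # 65536 addresses in the /16
print(subnet_cidr.num_addresses)                   # 256 addresses in the /24
```

This is why the subnet block later in the article uses 10.0.1.0/24: it is one of the 256 possible /24 subnets inside the VPC's /16.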

An Amazon Machine Image (AMI) is a template for creating EC2 instances. It contains the software and configuration information required to launch an instance. Think of it as a pre-packaged set of instructions for building a virtual server.

Figure 3: AMIs in action

Part III: Terraform structure definition and configuration

Terraform projects are typically structured like this:

Figure 4: Terraform structure definition

In Terraform, modules are reusable blocks of infrastructure code that encapsulate and organize related resources into a single unit, making your configurations more modular. Our VPC and EC2 configurations live in separate folders within our project; these are our modules. Three main files are defined in the root module:

  1. main.tf — This is the primary Terraform configuration file. Within a module, it defines the resources you want to provision, e.g., virtual machines, databases, and containers. In the root folder, it acts as a messenger between modules, passing vital information.
  2. provider.tf (optional) — This file configures Terraform providers to interact with specific cloud platforms or services.
  3. variables.tf (optional) — This file lets you define reusable variables with types and optional default values. It's useful if you have a large cloud infrastructure.

I have provided a basic template for this project on my GitHub. Go ahead and git pull the repository, and go through Terraform's basic syntax to get familiar with it.

Let’s begin by configuring our Virtual Private Cloud. Navigate to your /vpc/main.tf and paste this block.

## We will create 1 VPC, 1 subnet, and 1 security group

# A VPC is a private, isolated section of the AWS cloud where you can launch resources
resource "aws_vpc" "myvpc" {
  cidr_block           = "10.0.0.0/16"
  enable_dns_hostnames = true
  enable_dns_support   = true

  tags = {
    Name = "myvpc"
  }
}

# A subnet is a division of a VPC that defines a range of IP addresses
resource "aws_subnet" "pb_sn" {
  vpc_id                  = aws_vpc.myvpc.id
  cidr_block              = "10.0.1.0/24"
  map_public_ip_on_launch = true
  availability_zone       = "eu-north-1a"

  tags = {
    Name = "pb_sn1"
  }
}

# A security group is a virtual firewall that controls inbound and outbound traffic
# to resources within a VPC
resource "aws_security_group" "sg" {
  vpc_id      = aws_vpc.myvpc.id
  name        = "my_sg"
  description = "Public Security"

  # Ingress refers to incoming traffic to a resource within the VPC. It specifies
  # which ports and protocols can be accessed from outside the VPC.
  # This rule allows inbound SSH traffic (port 22) from any IP address.
  ingress {
    from_port   = 22
    to_port     = 22
    protocol    = "tcp"
    cidr_blocks = ["0.0.0.0/0"]
  }

  # Egress refers to outgoing traffic from a resource within the VPC. It specifies
  # which ports and protocols can be accessed from within the VPC.
  egress {
    from_port   = 0
    to_port     = 0
    protocol    = "-1"            # applies to all protocols (TCP, UDP, ICMP, etc.)
    cidr_blocks = ["0.0.0.0/0"]   # applies to all destination IP addresses (the entire internet)
  }
}

# In essence, the egress rule grants the security group complete outbound connectivity,
# allowing it to communicate with any resource on the internet. This can be useful in
# certain scenarios, but it is generally considered a security risk because it exposes
# the resources within the security group to potential threats.
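The risk called out in that last comment can be made concrete with a small audit sketch. This is a hypothetical Python helper, not part of the article's pipeline; it flags ingress rules that are open to the whole internet, mirroring the rule structure above:

```python
import ipaddress

def world_open_ports(ingress_rules):
    """Return the from_port of every rule reachable from any IPv4 address (0.0.0.0/0)."""
    everywhere = ipaddress.ip_network("0.0.0.0/0")
    return [
        rule["from_port"]
        for rule in ingress_rules
        if any(ipaddress.ip_network(c) == everywhere for c in rule["cidr_blocks"])
    ]

# Mirror of the ingress rule defined in the security group above
rules = [{"from_port": 22, "to_port": 22, "protocol": "tcp", "cidr_blocks": ["0.0.0.0/0"]}]
print(world_open_ports(rules))  # [22] — SSH is open to the entire internet
```

In a real setup you would restrict the SSH ingress CIDR to your own IP range rather than 0.0.0.0/0.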

We have configured a VPC, a subnet that provides a range of IP addresses, and a security group. Now let's configure our EC2 instance. Navigate to /ec2/main.tf and paste this block.


# An AMI (Amazon Machine Image) is a template for creating EC2 instances. It contains
# the software and configuration information required to launch an instance.
resource "aws_instance" "server" {
  ami                    = "<find-a-suitable-ami>"
  instance_type          = "t3.micro"   # free tier eligible
  subnet_id              = var.sn       # passed in from the VPC module (wired up below)
  vpc_security_group_ids = [var.sg]     # security group IDs, also from the VPC module

  tags = {
    Name = "myserver"
  }
}

You require an AMI ID that fits the basic requirements for this infrastructure. I found a good AMI to use in the AMI Catalog on AWS. Copy the AMI ID (ami-xxxxxxxx) and paste it into the ami argument in your code.
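As a side note, AMI IDs follow a fixed shape: "ami-" followed by hexadecimal characters (8 for older IDs, 17 for current ones — this length rule is an assumption about AWS's current ID format). A small illustrative check:

```python
import re

# "ami-" plus 8 hex chars (legacy) or 17 hex chars (current format)
AMI_ID = re.compile(r"^ami-[0-9a-f]{8}([0-9a-f]{9})?$")

print(bool(AMI_ID.match("ami-0abcdef1234567890")))  # True: 17 hex characters
print(bool(AMI_ID.match("ami-xxxxxxxx")))           # False: a placeholder, not hex
```

A check like this can catch a forgotten placeholder before terraform plan does.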

Figure 5: A good AMI to use

We require the subnet ID and the security group ID from our vpc module in our EC2 module. Navigate to /vpc/output.tf and paste the following block.

output "pb_sn" {
  value = aws_subnet.pb_sn.id
}

output "sg" {
  value = aws_security_group.sg.id
}

These outputs expose values from the VPC module so they can be passed to other modules. In /ec2/variables.tf, we define the variables the EC2 module expects to receive. Paste this block:

variable "sg" {
}

variable "sn" {
}

The main.tf in our root folder is where we pass important information between the two modules. We take the outputs we created in /vpc/output.tf and assign them to the variables in /ec2/variables.tf. Navigate to your root folder's main.tf and paste:

# This is the root module, where we wire the modules together

module "vpc" {
  source = "./vpc"
}

module "ec2" {
  source = "./ec2"
  sn     = module.vpc.pb_sn
  sg     = module.vpc.sg
}

State management is vital when working with infrastructure as code. Terraform state is a JSON file recording the configuration of your infrastructure, enabling Terraform to track changes and maintain consistency. Locking the state file prevents simultaneous writes, ensuring data integrity, synchronization, and consistency: before applying changes, Terraform requests a lock, checks for an existing lock, and proceeds only if none exists. We configure this in a file called backend.tf.
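The locking protocol can be illustrated with a toy in-memory sketch. This is not the real DynamoDB API, just the conditional-write idea it relies on: acquiring the lock fails if another holder already has it (all names here are hypothetical):

```python
# In-memory stand-in for the DynamoDB lock table
locks = {}

def acquire_lock(lock_id, holder):
    """Conditional write: succeeds only if no one holds the lock."""
    if lock_id in locks:        # another run already holds the lock
        return False
    locks[lock_id] = holder     # the conditional put succeeds
    return True

def release_lock(lock_id):
    """Free the lock once the state write is finished."""
    locks.pop(lock_id, None)

print(acquire_lock("terraform.tfstate", "pipeline-run-1"))  # True: lock acquired
print(acquire_lock("terraform.tfstate", "pipeline-run-2"))  # False: must wait
release_lock("terraform.tfstate")
print(acquire_lock("terraform.tfstate", "pipeline-run-2"))  # True: lock is free again
```

DynamoDB implements this atomically with a conditional PutItem, which is why Terraform uses it as the lock store alongside the S3 state bucket.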

Imagine you’re working in a team of developers on the same infrastructure. Shared access to the state, locking, versioning, and backups (in case a roll-back is required) become paramount. We achieve this with an AWS S3 bucket, which stores the Terraform state file (terraform.tfstate) holding metadata about your infrastructure resources, and DynamoDB, which provides the locking mechanism that prevents simultaneous modifications.

Navigate to your AWS homepage and search for S3. Create an S3 bucket as shown below.

Figure 5: Creating a general purpose S3 bucket

Create a folder within the S3 bucket and give it a name; this folder will act as the state key in our ./backend.tf file.

Figure 6: Creating a folder (state)

Navigate to your AWS homepage and search for DynamoDB. Create a table with default configurations as shown below. A mistake I made was in naming the partition key: Terraform expects it to be named “LockID”. Name your partition key “LockID” and save the table.

Figure 7: Creating your DynamoDB table.

In our root, create a backend.tf file and paste the following block:

terraform {
  backend "s3" {
    bucket         = "mystatebucket99"  # your S3 bucket's name
    key            = "state"            # the folder you created in your S3 bucket
    region         = "eu-north-1"       # your AWS region
    dynamodb_table = "mydynamotable"    # your DynamoDB table name
  }
}

We require the necessary permissions to write to our newly created resources. Navigate to your IAM user and attach the following permissions, as shown below:
– AmazonEC2FullAccess
– AmazonDynamoDBFullAccess
– AmazonS3FullAccess

Figure 8: Adding permissions

Once that is set up, your project will look like this:

Figure 9: Final project structure

Part IV: Project initialization

We need to connect to our AWS account through the CLI. Navigate to your user profile and take note of your access credentials. Navigate to your CLI and enter the required credentials:

aws configure   # you will be prompted to enter your credentials

The following Terraform commands are what we will use in our pipeline:

  1. terraform init — Performs backend initialization, child module installation, and plugin installation.
  2. terraform validate — Validates the configuration files in your directory without accessing any remote services.
  3. terraform plan — Shows what actions will be taken without actually performing them.
  4. terraform apply -auto-approve — Creates or updates infrastructure according to the configuration files. Adding -auto-approve applies the changes without having to type “yes” to approve the plan.
  5. terraform destroy — Destroys the infrastructure.
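As an illustration, the command sequence above can be assembled programmatically. This is a hypothetical Python helper, a sketch only; the real pipeline invokes these commands from GitLab CI, not Python:

```python
def terraform_cmd(stage):
    """Return the argv for each pipeline stage, mirroring the commands listed above."""
    commands = {
        "init": ["terraform", "init"],
        "validate": ["terraform", "validate"],
        "plan": ["terraform", "plan", "-out=planfile"],
        "apply": ["terraform", "apply", "-input=false", "planfile"],
        "destroy": ["terraform", "destroy", "--auto-approve"],
    }
    return commands[stage]

# The automated stages run in this order; apply and destroy are manual in our pipeline
for stage in ["init", "validate", "plan"]:
    print(" ".join(terraform_cmd(stage)))
```

Keeping the argv as a list (rather than a shell string) is the safe pattern if you ever do drive these commands from a script via subprocess.run.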
Figure 10: Terraform Init

Part V: CI/CD pipeline creation and deployment

We will create a CI/CD pipeline on GitLab for automation. Create a GitLab repository (make it public) and copy the link to the repository. Add a .gitignore and copy these contents into it. Navigate to your CLI and push your code to your repository.

git remote add origin <your-repository-link>
git remote -v                    # view the configured remote repositories for your Git project

git checkout -b mlops            # create a branch
git commit -m "Initial commit"
git push -u origin mlops

It’s always a good practice to push your code to a branch and merge later.

Figure 11: Merging your branch to the main

We need to create two variables on GitLab from AWS: an access key and a secret access key. Navigate to your IAM -> Users -> your created user -> Security Credentials -> Access keys. Proceed to generate an access key.

Figure 12: Creating your user access keys
Figure 13: Created AWS access keys

Navigate to your GitLab homepage and save the keys you created as shown below: MY_AWS_KEY holds the Access key, while MY_AWS_ACCESS_KEY holds the Secret Access key.

Figure 14: Saving your keys
Figure 15: Saved keys

Navigate to your GitLab repository. Create a new file and name it .gitlab-ci.yml. This file guides your pipeline on the necessary steps to take.

Figure 16: Creating the .gitlab-ci.yml

Once created, copy this block of code:

image:
  name: registry.gitlab.com/gitlab-org/gitlab-build-images:terraform
  entrypoint:
    - '/usr/bin/env'
    - 'PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin'

# This tells the GitLab CI/CD pipeline to use a Docker image containing
# Terraform and sets up the environment for the execution of Terraform commands
# within the container. This allows you to run your Terraform scripts without
# needing to install Terraform directly on the GitLab Runner machine.

variables:
  AWS_ACCESS_KEY_ID: ${MY_AWS_KEY}
  AWS_SECRET_ACCESS_KEY: ${MY_AWS_ACCESS_KEY}
  AWS_DEFAULT_REGION: "eu-north-1"

# Initializes your variables

before_script:
  - terraform --version
  - terraform init

# Steps to run before each job in the pipeline,
# i.e., checking the version and backend initialization

stages:
  - validate
  - plan
  - apply
  - destroy

validate:
  stage: validate
  script:
    - terraform validate

plan:
  stage: plan
  script:
    - terraform plan -out="planfile"
  dependencies:
    - validate
  artifacts:
    paths:
      - planfile

# Creates a planfile and stores it as an artifact

apply:
  stage: apply
  script:
    - terraform apply -input=false "planfile"
  dependencies:
    - plan
  when: manual

# "when: manual" means this stage only runs after manual intervention

destroy:
  stage: destroy
  script:
    - terraform destroy --auto-approve
  when: manual

Save your file and commit your changes. You will be prompted to launch the pipeline. Launch it and voila! The magic begins.

Figure 17: Pipeline in progress

Once completed, navigate to your AWS homepage and you will see your infrastructure has been created.

Figure 18: Pipeline is successful

We have successfully built a robust and efficient AWS Cloud Infrastructure with Terraform and GitLab CI/CD.

Thank you for reading my article.


Published via Towards AI
