This article assumes you’ve already chosen a programming language, done some data exploration, and practised working with datasets... and that you've only recently discovered Kaggle. 

How to get started: 

In a nutshell, use the Getting Started competition category.  

Fun Fact – the Getting Started competitions have a two-month rolling leaderboard. This lets new Kagglers see how their scores stack up against other current beginners, rather than against every Kaggle user ever. 

1.    Titanic: Machine Learning from Disaster

Whether you’re new to Machine Learning and Data Science, or you’re after a gentle introduction to Kaggle Prediction Competitions, this is definitely your first stop. 

This competition lets you practise binary classification along with your basic Python and R skills. 

The challenge asks you to analyse which types of people were likely to survive the sinking of the Titanic, and then apply machine learning tools to predict which passengers survived the wreck.
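As a flavour of what that prediction step looks like, here is a minimal sketch of binary classification with scikit-learn. The tiny DataFrame stands in for Kaggle's train.csv; the column names mirror the real dataset, but the rows are made up for illustration.

```python
# A minimal binary-classification sketch in the style of the Titanic task.
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Toy stand-in for Kaggle's train.csv (column names match, values invented).
train = pd.DataFrame({
    "Pclass":   [1, 3, 3, 1, 2, 3],
    "Sex":      ["female", "male", "male", "female", "female", "male"],
    "Age":      [29, 22, 35, 38, 27, 4],
    "Survived": [1, 0, 0, 1, 1, 0],
})

# Encode the categorical Sex column as 0/1 so the model can use it.
train["Sex"] = (train["Sex"] == "female").astype(int)

X = train[["Pclass", "Sex", "Age"]]
y = train["Survived"]

model = LogisticRegression()
model.fit(X, y)

# Predict survival for an unseen (hypothetical) passenger.
print(model.predict(pd.DataFrame({"Pclass": [3], "Sex": [0], "Age": [30]})))
```

On the real competition you would fit on train.csv, predict on test.csv, and submit the resulting 0/1 column.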

2.    Digit Recognizer

This competition challenges you to correctly identify digits from a dataset of tens of thousands of handwritten images. It's the go-to if you've got some experience with R, Python, and machine learning but are a total newbie to computer vision, or if you want a fantastic introduction to neural networks using a classic dataset that includes pre-extracted features. 

This competition allows you to practice computer vision fundamentals like simple neural networks, and classification methods like k-NN and SVM.
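The k-NN route can be sketched in a few lines with scikit-learn. The example below uses scikit-learn's bundled 8x8 digits dataset as a small stand-in for Kaggle's 28x28 MNIST images:

```python
# k-NN digit classification: each row is a flattened image, and a test
# image is labelled by majority vote among its k nearest training images
# in pixel space.
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

digits = load_digits()
X_train, X_test, y_train, y_test = train_test_split(
    digits.data, digits.target, test_size=0.25, random_state=0)

knn = KNeighborsClassifier(n_neighbors=3)
knn.fit(X_train, y_train)
print(f"accuracy: {knn.score(X_test, y_test):.3f}")
```

Even this simple baseline scores well on clean digit data, which makes it a good first submission before you try a neural network.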

3.    Housing Prices: Advanced Regression Techniques

This is the competition for you if you’ve got some decent experience with Python, R, and machine learning basics. It’s also a pretty good last stop before moving on to the competitions featured in the Playground. 

This competition asks you to predict the final price of residential homes in Ames, Iowa, using 79 explanatory variables that describe pretty much every aspect of a home. 

You’ll get to practise creative feature engineering and some more advanced regression techniques like gradient boosting and random forests. 


Once you’re done with the Getting Started category, it’s time to move on to the Playground category. These competitions will push you a bit harder than the Getting Started ones, and are definitely the “for fun” kind rather than the sort that come with big prizes.

Recommended Playground Challenges: 

1.    Facial Keypoints Detection

Objective: predict keypoint positions on face images

2.    Sentiment Analysis of Movie Reviews

Objective: label phrases in movie reviews on a scale of five values: negative, somewhat negative, neutral, somewhat positive, positive

3.    Random Acts of Pizza

Objective: create an algorithm capable of predicting which Reddit requests will garner pizza kindness (I know, awful pun).

4.    Bike Sharing Demand

Objective: combine historical usage patterns with weather data in order to forecast bike rental demand in the Capital Bikeshare program in Washington, D.C.

5.    Forest Cover Type Prediction

Objective: predict the predominant kind of tree cover from cartographic variables. The data is in raw form and contains binary columns for the qualitative independent variables.

6.    Leaf Classification

Objective: use binary leaf images and extracted features, including shape, margin & texture, to accurately identify 99 species of plants.


And finally, if you’re finding all of the above too easy, here are some Featured and Research competitions that will push you… hard… 


1.    Allstate Claims Severity

2.    Toxic Comment Classification Challenge 

3.    Zillow’s Home Value Prediction


4.    Google Landmark Recognition

5.    Right Whale Recognition

6.    Large Scale Hierarchical Text Classification

So, there you have it: a brief road map to kickstarting your Kaggle career. Have fun, explore, get it wrong, and Kaggle on. 