About CloudCV

CloudCV began in the summer of 2013 as a research project within the Machine Learning and Perception lab at Virginia Tech (now at Georgia Tech), with the ambitious goal of building platforms that make AI research more reproducible. We’re a young community working towards enabling developers, researchers, and fellow students to build, compare, and share state-of-the-art Artificial Intelligence algorithms. We believe that one shouldn’t have to be an AI expert to have access to cutting-edge vision algorithms. Likewise, researchers shouldn’t have to worry about building a service around their deep learning models to showcase and share them with others.

We have participated in the past eight installments of Google Summer of Code, over the course of which our students built several excellent tools and features. If you are interested in participating as a student or mentor, scroll down to check out our projects and get involved! We are more than happy to answer any questions you may have regarding CloudCV, so feel free to reach out to us on our Slack workspace or on our mailing list.


Admin Tools Enhancement and Cost Optimization

Angular, Django, AWS, Python

This project aims to elevate the admin experience on EvalAI while implementing cost-effective measures for platform maintenance. Key focus areas include enhancing automation for managing expired submissions in SQS queues, identifying and optimizing ECS instances based on AWS health metrics, and providing convenient admin actions through the Django administration interface. Cost optimization measures involve setting custom SQS queue retention times, refining auto-cancel scripts, and identifying and removing unnecessary AWS instances and repositories. Additionally, the project aims to automate infrastructure monitoring for improved efficiency, making EvalAI administration seamless and cost-efficient.
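As a rough illustration of the kind of automation involved, the sketch below sets a custom retention period on a submission queue with boto3 and exposes a bulk cancellation action in the Django admin. The queue URL and the model fields (status, is_expired) are hypothetical placeholders rather than EvalAI's actual schema.

    import boto3
    from django.contrib import admin

    def set_queue_retention(queue_url, retention_seconds=1209600):
        # Set a custom message retention period (60s up to 14 days) on a challenge queue.
        sqs = boto3.client("sqs")
        sqs.set_queue_attributes(
            QueueUrl=queue_url,
            Attributes={"MessageRetentionPeriod": str(retention_seconds)},
        )

    @admin.action(description="Cancel expired submissions")
    def cancel_expired_submissions(modeladmin, request, queryset):
        # Bulk-cancel stale rows; 'status' and 'is_expired' are hypothetical fields.
        queryset.filter(status="running", is_expired=True).update(status="cancelled")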

Project Size: Medium (175 hours)

Difficulty Rating: Medium



Challenge Synchronization with GitHub Repositories

Django, Python, GitHub API

This project focuses on streamlining the migration of legacy challenges on EvalAI by transitioning from challenge zip files to GitHub repositories. It involves creating GitHub repositories for hosts, copying over relevant files, and ensuring compatibility for various challenge types.

Additionally, the bidirectional sync functionality aims to keep information consistent between EvalAI and GitHub, enabling hosts to make edits on EvalAI that are reflected in their GitHub repositories and vice versa. By achieving enhanced compatibility and synchronization, the project aims to provide a more efficient experience for hosts managing challenges on EvalAI.
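As one possible shape for the EvalAI-to-GitHub half of the sync, the sketch below updates a file in a host's repository through the GitHub contents API. The repository layout, file path, and token handling are illustrative assumptions only; a real implementation would hang off EvalAI's challenge models and GitHub webhooks.

    import base64
    import requests

    GITHUB_API = "https://api.github.com"

    def push_config(owner, repo, path, new_content, token, message="Sync from EvalAI"):
        # Update (or create) a file in the host's repository via the contents API.
        headers = {
            "Authorization": f"token {token}",
            "Accept": "application/vnd.github+json",
        }
        url = f"{GITHUB_API}/repos/{owner}/{repo}/contents/{path}"
        current = requests.get(url, headers=headers)  # existing file, if any
        payload = {
            "message": message,
            "content": base64.b64encode(new_content.encode()).decode(),
        }
        if current.status_code == 200:
            payload["sha"] = current.json()["sha"]  # required when updating an existing file
        requests.put(url, headers=headers, json=payload).raise_for_status()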

Project Size: Large (350 hours)

Difficulty Rating: Hard



Enhanced Exception Handling, Testing, and Documentation

Django, Markdown, Python

This project aims to enhance EvalAI’s overall user experience by implementing robust exception handling, improving test coverage, and enhancing documentation.

Key deliverables include increased test coverage for critical components, especially API-related ones, and integration of Postman API testing. Documentation improvements encompass detailed API summaries, AWS setup tutorials for organizers, and FAQs. Additionally, the project focuses on thorough error analysis, enhancing error messages for quicker issue resolution, and ensuring proper exception handling in AWS utilities, all of which contribute to EvalAI’s reliability and ease of use.
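As a hedged example of the exception-handling piece, the sketch below wraps a boto3 call so that an AWS failure surfaces as a clear, logged message rather than an unhandled traceback. The function name, logger setup, and return shape are assumptions, not EvalAI's existing utilities.

    import logging

    import boto3
    from botocore.exceptions import ClientError

    logger = logging.getLogger(__name__)

    def fetch_submission_message(queue_url):
        # Receive one submission message, converting AWS errors into a readable response.
        sqs = boto3.client("sqs")
        try:
            return sqs.receive_message(QueueUrl=queue_url, MaxNumberOfMessages=1)
        except ClientError as exc:
            code = exc.response["Error"]["Code"]
            logger.exception("SQS receive_message failed for %s (%s)", queue_url, code)
            return {"error": f"Could not fetch submission message: {code}"}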

Project Size: Medium (175 hours)

Difficulty Rating: Easy



Seamless User Experience & Leaderboard Porting

Angular, Django, SQL, AWS

This project focuses on elevating the host experience on EvalAI through the introduction of user-centric features and optimizations. Key features include enabling custom requests for code-upload challenges on the EKS platform and streamlining the migration of leaderboards from external sources to EvalAI.

The project also aims to enhance user-centric aspects by implementing customizable participant details download, improving error handling for email verification, and embedding forums on challenge pages for seamless communication.

Additionally, it addresses technical improvements like fixing search and filter features, automating challenge category/forum creation, and resolving issues with pod log population on AWS. Overall, these deliverables aim to provide users with a more efficient and enriched participation experience on EvalAI.
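As a rough outline of the leaderboard migration piece, the sketch below reads an external CSV export and bulk-creates rows against a hypothetical LeaderboardEntry model; the model name, fields, and CSV columns are placeholders chosen for illustration.

    import csv

    from myapp.models import LeaderboardEntry  # hypothetical model and app

    def import_external_leaderboard(csv_path, challenge_phase_id):
        # Convert each CSV row into a leaderboard entry and insert them in one query.
        entries = []
        with open(csv_path, newline="") as fh:
            for row in csv.DictReader(fh):
                entries.append(LeaderboardEntry(
                    challenge_phase_id=challenge_phase_id,
                    participant_team=row["team"],
                    score=float(row["score"]),
                ))
        LeaderboardEntry.objects.bulk_create(entries)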

Project Size: Medium (175 hours)

Difficulty Rating: Medium


Become a Mentor!

Mentoring is very important to the future of CloudCV. It introduces new people to the world of open-source software, people who will enrich our community with their ideas and talents.

Apart from technical skills, being a mentor requires your time, a clear roadmap for your project, and good organizational skills. If you think you would be a good fit to mentor one of our projects, do reach out to us!



Have your own idea? Add an issue to our GSoC-Ideas repository.

In case of queries, you can contact us.

Email: team@cloudcv.org