UCC AI Quest 2023

Organized by ucc-ai-quest

Working on AI solutions together

Cork is blessed with breathtaking landscapes and serene greenery. This year, UCC AI Quest focuses on stunning aerial images captured by a high-resolution camera, with the task of recognising vegetation patches in Irish natural areas. The challenge aims to foster the development of reliable artificial intelligence models that can inform sustainable development. It includes the release of a new dataset of realistic drone images for benchmarking semantic segmentation at various above-ground levels. There are awards for the best team (€5,000), the most creative solution (€1,000) and the top women of influence (€1,000).

Read more about the competition on our website.

Result Submission 

In each phase, participants are required to develop models that predict high vegetation (class 1) vs. background (class 0).

The submission file must be a results.zip archive that contains a results.json file. An example of the results.json format is given here and also listed below:

{
    "DJI_0042_342.JPG": {
        "counts": [4, 10, 4, 101, 203, ...],
        "height": 384,
        "width": 640
    },
    "DJI_0062_275.JPG": {
        "counts": [8, 20, 400, 101, 203, ...],
        "height": 380,
        "width": 740
    },
    ...
}

Here, the field "counts" is the run-length encoding (RLE) of the predicted binary mask. For example, if the RGB image's dimensions are 348 x 512 x 3 (w x h x c), the predicted mask should be a two-dimensional 348 x 512 array, which can be converted to an RLE array using the sample Python code below:

=====

import numpy as np


def mask_to_rle(mask: np.ndarray):
    """
    Convert a binary mask to RLE format.
    :param mask: numpy array, 1 - mask, 0 - background
    :return: RLE array
    """
    # Flatten in column-major order and pad with zeros so that runs
    # touching the image borders are detected.
    pixels = mask.T.flatten()
    pixels = np.concatenate([[0], pixels, [0]])
    # Indices at which the pixel value changes mark run boundaries.
    runs = np.where(pixels[1:] != pixels[:-1])[0] + 1
    # Turn every second entry from an end index into a run length.
    runs[1::2] -= runs[::2]
    return [int(x) for x in runs]

=====
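
As a rough illustration, the sketch below shows one way to assemble predicted masks into a results.json file and package it as results.zip in the format above, using the mask_to_rle function defined earlier. The build_submission helper and the in-memory dictionary of masks are illustrative assumptions rather than official tooling, and the code assumes each mask array is stored as (height, width).

=====

import json
import zipfile

import numpy as np


def build_submission(predictions: dict, out_path: str = "results.zip"):
    """
    Package predicted binary masks into results.zip containing results.json.
    :param predictions: mapping of image file name -> binary mask (numpy array)
    :param out_path: path of the zip archive to write
    """
    results = {}
    for image_name, mask in predictions.items():
        # Assumes the mask is stored as (height, width); adjust if your
        # pipeline uses a different orientation.
        height, width = mask.shape
        results[image_name] = {
            "counts": mask_to_rle(mask),  # RLE helper defined above
            "height": int(height),
            "width": int(width),
        }
    with zipfile.ZipFile(out_path, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.writestr("results.json", json.dumps(results))


# Example usage with a dummy all-background prediction:
# build_submission({"DJI_0042_342.JPG": np.zeros((384, 640), dtype=np.uint8)})

=====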

Evaluation Metric

The results shall be evaluated using Intersection over Union (IoU) for high vegetation (class 1).
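
For reference, the snippet below is a minimal sketch of how the IoU for the high-vegetation class can be computed from a predicted and a ground-truth binary mask; the organizers' evaluation script may differ in details such as how images with an empty union are handled.

=====

import numpy as np


def iou_high_vegetation(pred: np.ndarray, gt: np.ndarray) -> float:
    """
    Intersection over Union for class 1 (high vegetation).
    :param pred: predicted binary mask, 1 - high vegetation, 0 - background
    :param gt: ground-truth binary mask of the same shape
    :return: IoU score in [0, 1]
    """
    pred = pred.astype(bool)
    gt = gt.astype(bool)
    intersection = np.logical_and(pred, gt).sum()
    union = np.logical_or(pred, gt).sum()
    # Convention assumed here: if neither mask contains class 1, return 1.0.
    return float(intersection) / float(union) if union > 0 else 1.0

=====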

Example Code and Baseline

 https://colab.research.google.com/drive/1wYK6JKU470jUQxJ8O8R2Vx1xVnJjAjpd

General rules

  • Right to cancel, modify, or disqualify. The Competition Organizer reserves the right at its sole discretion to terminate, modify, or suspend the competition.

  • By submitting results to this competition, you consent to the public release of your scores at the Competition workshop and in the associated proceedings, at the task organizers' discretion. Scores may include, but are not limited to, automatic and manual quantitative judgments, qualitative judgments, and such other metrics as the task organizers see fit. You accept that the ultimate decision of metric choice and score value rests with the task organizers.

  • By joining the competition, you accept the terms and conditions of the Terms of Participation and Data Use Agreement of UCC AI Quest 2023, which have been sent to your email.

  • By joining the competition, you affirm and acknowledge that you agree to comply with applicable laws and regulations, that you will not infringe upon any copyrights, intellectual property, or patents of another party in the software you develop in the course of the competition, and that you will not breach any applicable laws and regulations related to export control and data privacy and protection.

  • Prizes are subject to the Competition Organizer’s review and verification of the entrant’s eligibility and compliance with these rules as well as the compliance of the winning submissions with the submission requirements.

  • Participants grant the Competition Organizer the right to use their winning submissions, and the source code and data created for and used to generate those submissions, for any purpose whatsoever and without further approval.

  • External data is NOT allowed.

Eligibility

  • Each participant must create an AIHub account to submit their solution to the competition. Only one account per user is allowed.

  • The competition is public, but the Competition Organizer may elect to disallow participation according to its own considerations.

  • The Competition Organizer reserves the right to disqualify any entrant from the competition if, in the Competition Organizer’s sole discretion, it reasonably believes that the entrant has attempted to undermine the legitimate operation of the competition through cheating, deception, or other unfair playing practices.

Team

  • Participants are allowed to form teams. 

  • You may not participate in more than one team. Each team member must be a single individual operating a separate AIHub account. 

Submission

  • Maximum number of submissions in each phase:

    • Warm Up: 10 submissions / day / team
    • Challenge: 10 submissions / day / team
    • Private Test: 5 submissions / day / team
  • Submissions are void if they are in whole or part illegible, incomplete, damaged, altered, counterfeit, obtained through fraudulent means, or late. The Competition Organizer reserves the right, in its sole discretion, to disqualify any entrant who makes a submission that does not adhere to all requirements.

Data

By downloading or accessing the data provided by the Competition Organizer in any manner, you agree to the following terms:

  • You will not distribute the data except for non-commercial, academic-research purposes.

  • You will not distribute, copy, reproduce, disclose, assign, sublicense, embed, host, transfer, sell, trade, or resell any portion of the data provided by the Competition Organizer to any third party for any purpose.

  • The data must not be used for providing surveillance, analyses or research that isolates a group of individuals or any single individual for any unlawful or discriminatory purpose.

  • You accept full responsibility for your use of the data and shall defend and indemnify the Competition Organizer against any and all claims arising from your use of the data.

Schedule

  • Warm Up: starts Oct. 25, 2023, noon UTC
  • Challenge: starts Dec. 15, 2023, noon UTC
  • Private Test: starts Jan. 29, 2024, noon UTC
  • Competition Ends: Feb. 8, 2024, noon UTC

Leaderboard

#   Username       Score
1   vantuan5644    85.32
2   nhtlongcs      85.04
3   kaylodes       84.41