IARPA
Open Automatic Speech Recognition Challenge
(OpenASR21)


Contact Us
For any information about the OpenASR21 Evaluation (data, evaluation code, etc.), please email: openasr_poc@nist.gov
Welcome to the IARPA Open Automatic Speech Recognition Challenge (OpenASR21)
The OpenASR21 Challenge is the third open challenge created out of the Intelligence Advanced Research Projects Activity (IARPA) Machine Translation for English Retrieval of Information in Any Language (MATERIAL) program, which covers more languages and additional tasks, including cross-language information retrieval, domain classification, and summarization. For every year of MATERIAL, the National Institute of Standards and Technology (NIST) organizes a simplified, smaller-scale evaluation, open to anyone wishing to participate, that focuses on a particular technology aspect of MATERIAL. In 2019, the open challenge, OpenCLIR, focused on Cross-Language Information Retrieval (CLIR) technologies. In 2020, the focus was on Automatic Speech Recognition (ASR) under low-resource language constraints for the first time. In 2021, ASR under low-resource language constraints is being offered again, but with new languages and case-sensitive scoring added.
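For intuition on what case-sensitive scoring changes: under word error rate (WER), a casing mismatch counts as a word substitution. Below is a minimal sketch of WER computed case-sensitively versus case-insensitively; it is not the official scoring pipeline (which is defined in the evaluation plan), and the example sentences are made up.

```python
# Minimal WER sketch: Levenshtein distance over whitespace-tokenized words,
# divided by the reference length. Illustrates how case-sensitive scoring
# penalizes casing differences. Not the official OpenASR21 scorer.

def wer(ref_words, hyp_words):
    # d[i][j] = edit distance between ref_words[:i] and hyp_words[:j]
    d = [[0] * (len(hyp_words) + 1) for _ in range(len(ref_words) + 1)]
    for i in range(len(ref_words) + 1):
        d[i][0] = i
    for j in range(len(hyp_words) + 1):
        d[0][j] = j
    for i in range(1, len(ref_words) + 1):
        for j in range(1, len(hyp_words) + 1):
            sub = d[i - 1][j - 1] + (ref_words[i - 1] != hyp_words[j - 1])
            d[i][j] = min(sub, d[i - 1][j] + 1, d[i][j - 1] + 1)
    return d[-1][-1] / max(len(ref_words), 1)

ref = "Amharic is spoken in Ethiopia".split()
hyp = "amharic is spoken in ethiopia".split()

print(wer(ref, hyp))  # case-sensitive: 2 substitutions / 5 words = 0.4
print(wer([w.lower() for w in ref], [w.lower() for w in hyp]))  # 0.0
```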
Please visit the OpenASR21 page for the current version of the evaluation plan, results, and other resources and tools.
Schedule

Milestone Date
Evaluation plan release July 2021
Registration period August 9 – October 15, 2021
Development period August 9 – November 2021 (potentially longer)
     Build and Dev datasets release August 9, 2021
     Scoring server accepts submissions for Dev datasets August 30 – November 2021 (potentially longer)
Evaluation period November 3-10, 2021
     Eval datasets release November 3, 2021
     Scoring server accepts submissions for Eval datasets November 4-10, 2021
     System output due to NIST November 10, 2021
     System description due to NIST November 19, 2021
Participation
OpenASR21 is open to the public worldwide. Anyone may register to participate, apply their technologies to the OpenASR21 data, and submit their system results to the OpenASR21 evaluation server. Participation is free.

Participation Logistics
  • Register for an evaluation account (if you don’t have one already). This account allows you to make submissions, check the submission status, and receive the results.

  • Sign the participation and data usage agreement. The agreement indicates that you agree to abide by the data usage terms as well as the evaluation rules set forth in the evaluation plan.

  • Once the agreement is approved, you will get an email with instructions on how to access the development data.

  • At the appropriate time, the evaluation data will be made available.

  • Follow the instructions in the evaluation plan on how to format and package your submissions.
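As a purely hypothetical illustration of that packaging step, the sketch below bundles a directory of output files into a gzipped tar archive. The directory name, file extension, and archive name are made up; the actual layout and naming convention are specified in the evaluation plan.

```python
# Hypothetical packaging sketch: bundle system output files for upload.
# The real file naming and archive layout are defined in the OpenASR21
# Evaluation Plan; adjust this to match before submitting.
import tarfile
from pathlib import Path

output_dir = Path("system_output")              # hypothetical output directory
archive = Path("mysite_myteam_submission.tgz")  # hypothetical archive name

with tarfile.open(archive, "w:gz") as tar:
    files = sorted(output_dir.glob("*.txt"))    # hypothetical file extension
    for f in files:
        tar.add(f, arcname=f.name)              # store files flat in the archive

print(f"Packaged {len(files)} files into {archive}")
```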

Registration Instructions
If you already have an account, log in here or at the top of the page. To create an account and register, follow the steps below.
To Create an Account:
1- Click the “Sign up” button at the top of the page.
2- Provide a valid e-mail address and create a password in the “Sign up” form. (A confirmation email will be sent to that email address.)
3- Click “Confirm my account” in the e-mail sent to you. (A sign-in page will display with your email address and password already entered.)
4- Click “Sign in”. (A dashboard for your account will display with Registration Steps.)
5- Complete the steps in the dashboard to finish creating your account.
Registration is complete once steps 1-5 are done.
When you are notified by email from NIST that your License Agreement is approved, you can then access the data.
Creating a Team or Joining a Site and Team
When joining OpenASR21, a participant can create a Site, join an existing Site and create a Team, or join an existing Team. A participant can be a member of multiple teams.
Each participant, Site, and Team has its own password; the creator of a Site or Team sets the password for that Site or Team.
The Data License Agreement
The Site creator is required to agree to the NIST terms in order to access data for that Site. Read the NIST license agreement and accept the terms by uploading the signed license agreement form. Participants cannot access the data until NIST approves the uploaded signed license agreement.
The Dashboard
The dashboard is the personalized page for each participant. To access the dashboard at any time, click “Dashboard” at the top right of the screen. This is where you can make submissions and view results.

System Output Submission Instructions
Each submission must be associated with a Site, Team, and Task.
The portals for making submissions have already been created.
Submit system output for validation checking or scoring by following these steps:
1- Prepare for Submission
  • System output must be in the format described in the OpenASR21 Evaluation Plan.
2- Go to Dashboard. (Select "Dashboard" on the top right of the page.)
  • In "Submission Management", click the Task that represents the system output.
  • On the “Submissions” page, click “Upload” for each combination of dataset and training condition. The “New Submission” page displays.
    • Click “Choose File” and select the submission file for the corresponding “Language”.
    • Click “Submit”.
    • A submission ID is automatically created for the submission.
    • The “Scoring” button on the “Submissions” page displays “submitted” until the scoring server completes scoring, at which point it changes to “Done”.
3- View Submission Results
  • To see a Scoring Run Report, click the “Scoring” button once “submitted” changes to “Done”.
  • To see information about the submission, click the “View Submission” button.

The NIST scoring server will perform a validation check on each system output submission to confirm that it conforms to the submission format required for each task.
Submission of a system description conforming to the system description guidelines in the OpenASR21 Evaluation Plan is required before receiving the system’s score and ranking results in the Evaluation phase.
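Running a quick local sanity check before uploading can save a failed validation round-trip. The checks in the sketch below are hypothetical (UTF-8 decodability and no empty lines); substitute the actual format rules from the OpenASR21 Evaluation Plan.

```python
# Hypothetical pre-upload sanity check for system output files.
# The authoritative format requirements are in the OpenASR21 Evaluation
# Plan; these checks (UTF-8, no empty lines) are illustrative only.
from pathlib import Path

def precheck(path: Path) -> list[str]:
    """Return a list of human-readable problems found in one output file."""
    try:
        text = path.read_text(encoding="utf-8")
    except UnicodeDecodeError:
        return [f"{path}: not valid UTF-8"]
    problems = []
    for n, line in enumerate(text.splitlines(), start=1):
        if not line.strip():
            problems.append(f"{path}:{n}: empty line")
    return problems

for f in sorted(Path("system_output").glob("*.txt")):  # hypothetical layout
    for problem in precheck(f):
        print(problem)
```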

FAQ
Below are answers to specific questions received that clarify or expand on the rules in the evaluation plan.

Availability of data received under the license agreement for research purposes:
The data received under the license agreement for OpenASR21 will remain available to you for research purposes until INTERSPEECH 2022 (September 2022). However, the data may not be shared outside of your OpenASR21 team during that time.

Pretrained model availability date for Constrained-plus training:
The August 9 cutoff for public availability of pretrained models permissible for Constrained-plus training is firm. Any pretrained models developed or made public after this date are only permissible for Unconstrained training.

Tuning on evaluation dataset at runtime:
Tuning on evaluation data, even at evaluation runtime, is not permitted for official OpenASR21 submissions. This does not preclude research involving such tuning in other experiments while you still have the OpenASR21 data available under the license agreement for research.

Use of Unconstrained weights for Constrained training:
Using weights from an Unconstrained training model to initialize weights for Constrained training models is not permitted.

Use of OpenASR21 data other than the target language for training:
A reminder for those who registered for more than one language: any data received for languages besides the one being processed is not considered publicly available, and thus is not permitted under any of the training conditions. For example, if you registered for languages A and B, you may not use any data received for language B for training for your submissions for language A, under any training condition.

Registration updates:
If you decide to withdraw your participation for any languages you registered for, please let us know at openasr_poc@nist.gov.