IARPA Open Automatic Speech Recognition Challenge (OpenASR20)


Contact Us
For any information about the OpenASR20 Evaluation (data, evaluation code, etc.) please email: openasr_poc@nist.gov
Welcome to the IARPA Open Automatic Speech Recognition Challenge (OpenASR20)
The goal of OpenASR20 is to assess the state of the art of ASR technologies for low-resource languages. The OpenASR20 Challenge is an open challenge created out of the IARPA (Intelligence Advanced Research Projects Activity) MATERIAL (Machine Translation for English Retrieval of Information in Any Language) program, which encompasses additional tasks, including CLIR (cross-language information retrieval), domain classification, and summarization. For each year of MATERIAL, NIST supports a simplified, smaller-scale evaluation, open to all, that focuses on a particular technology aspect of MATERIAL. CLIR technologies were the focus of the first open challenge, OpenCLIR, in 2019. In 2020, the focus is on ASR. The capabilities tested in the open challenges are expected to ultimately support the MATERIAL task of effective triage and analysis of large volumes of data in a variety of less-studied languages.
Please visit the OpenASR20 page for the current version of the evaluation plan, results, as well as other resources and tools.
Tentative Schedule

Evaluation plan release: July 2020
Registration period: August (date TBD) - October 15, 2020
Development period: August – November 2020 (potentially longer)
     Train and Dev datasets release: August 2020
     Scoring server accepts submissions for Dev datasets: August – November 2020 (potentially longer)
Evaluation period: November 3-10, 2020
     Eval datasets release: November 3, 2020
     Scoring server accepts submissions for Eval datasets: November 4-10, 2020
     System output due to NIST: November 10, 2020
     System description due to NIST: November 23, 2020
Participation
OpenASR20 is open to the public worldwide. Anyone may register to participate, apply their technologies to the OpenASR20 data, and submit their system results to the OpenASR evaluation server. Participation is free.

Participation Logistics
Each participant must create an account on this web platform. After creating an account, each participant will either create a new Site/Team or join an existing Site/Team.

After registering and having the data usage agreement approved, participants will be able to participate in the OpenASR20 Evaluation and receive instructions on how to obtain Train, Dev, and Eval datasets.

Participants will submit tar.gz files of their system’s output to the NIST OpenASR scoring server using this web site.
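As a rough illustration, the sketch below uses Python's standard tarfile module to package a directory of system output files into a tar.gz archive. The directory name, archive name, and file layout here are placeholders, not prescribed names; the actual contents and naming conventions for a submission are defined in the OpenASR20 Evaluation Plan.

  # Minimal packaging sketch. Assumption: the directory and archive names are
  # illustrative only; the required contents and naming are specified in the
  # OpenASR20 Evaluation Plan.
  import tarfile
  from pathlib import Path

  output_dir = Path("system_output")            # hypothetical directory of output files
  archive_name = "openasr20_submission.tar.gz"  # hypothetical archive name

  with tarfile.open(archive_name, "w:gz") as tar:
      for path in sorted(output_dir.rglob("*")):
          if path.is_file():
              # store paths relative to the output directory
              tar.add(path, arcname=str(path.relative_to(output_dir)))

  print(f"Wrote {archive_name}")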

Go to the Registration Instructions tab for registration instructions and to register for OpenASR20.

Registration Instructions
If you already have an account, log in here or at the top of the page. To create an account and register, follow the steps below.
To Create an Account:
1- Click the "Sign up" button at the top of the page.
2- Provide a valid e-mail address and create a password in the “Sign up” form. (After you click “Sign up” on the “OpenASR20 Sign up” form, a confirmation email will be sent to that address.)
3- Click “Confirm my account” in the e-mail sent to you. (A sign in page will display with your email address and created password already entered.)
4- Click “Sign in”. (A dashboard for your account will display with Registration Steps.)
5- Complete the steps in the dashboard to finish creating your account.
6- Registration is complete once steps 1-5 are finished.
When NIST notifies you by email that your License Agreement has been approved, you can access the data.
Creating a Team or Joining a Site and Team
When joining OpenASR, a participant can either create a Site, or join an existing Site and create a Team, or join an existing Team. A participant can be a member of multiple teams.
Each participant, Site, and Team has its own password; the Site and Team passwords are set by their respective creators.
The Data License Agreement
The Site creator is required to agree to the NIST terms in order to access data for that Site. Read the NIST license agreement and accept the terms by uploading the signed license agreement form. Participants cannot access the data until NIST approves the uploaded, signed license agreement.
The Dashboard
The dashboard is the personalized page for each participant. To access the dashboard at any time, click “Dashboard” at the top right of the screen. This is where you can make submissions and view results.

System Output Submission Instructions
Each submission must be associated with a Site, Team, and Task.
The submission portals have already been created.
Submit system output for validation checking or scoring by following these steps:
1- Prepare for Submission
  • System output must be in the format described in the OpenASR20 Evaluation Plan.
2- Go to Dashboard. (Select "Dashboard" on the top right of the page.)
  • In "Submission Management", click the Task that represents the system output.
  • On the “Submissions” page, click “Upload” for each dataset × training condition combination. The “New Submission” page displays.
    • Click “Choose File” to select the file to upload, and choose the “Language” corresponding to the submission.
    • Click "Submit".
    • A submission ID is automatically created for the submission.
    • The “Scoring” button on the “Submissions” page displays “submitted” until the scoring server completes scoring; it then changes to “Done”.
3- View Submission Results
  • To see a Scoring Run Report, click the “Scoring” button after its label changes from “submitted” to “Done”.
  • To see information about the submission, click the “View Submission” button.

The NIST scoring server will perform a validation check on each system output submission to verify that it conforms to the submission format required for the task.
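Before uploading, it can help to run a quick local sanity check that the archive is readable and contains the files you expect; a minimal sketch is below. Note that this does not reproduce the server's validation, which checks conformance to the submission format defined in the evaluation plan, and the archive name is an illustrative placeholder.

  # Local sanity-check sketch. Assumption: the archive name is illustrative.
  # It only confirms the file is a readable tar.gz and lists its contents;
  # it does not replicate the NIST server's format validation.
  import tarfile

  archive_name = "openasr20_submission.tar.gz"

  with tarfile.open(archive_name, "r:gz") as tar:
      members = [m.name for m in tar.getmembers() if m.isfile()]

  print(f"{archive_name} contains {len(members)} file(s):")
  for name in members:
      print("  " + name)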
A system description conforming to the system description guidelines in the OpenASR20 Evaluation Plan must be submitted before the system’s score and ranking results are released in the Evaluation phase.

Tools