
DARPA Computational Cultural Understanding Open Evaluation (OpenCCU)

News
Registration for OpenCCU is now open!


Contact
For more information about the evaluation (e.g., data, scoring tools), contact nist_ccu@nist.gov.
Overview
In an effort to build critical mass to solve challenges posed in the Defense Advanced Research Projects Agency (DARPA) Computational Cultural Understanding (CCU) program, the National Institute of Standards and Technology (NIST) is organizing a smaller-scale evaluation as part of an open track in TREC 2024 for researchers outside of the CCU program who want to contribute to a particular technology area in CCU. This first evaluation track focuses on the detection of sociocultural norms in video recordings of a variety of interactions between two or more people speaking Mandarin Chinese.

Successful communication entails not only knowing the local language but also understanding the local cultures and customs. Violating a cultural norm may derail a conversation and lead to disastrous consequences. As such, detecting social norms and determining whether a speaker is adhering to or violating them is a foundational component of dialogue assistance applications that facilitate successful communication between individuals who do not speak a common language and are not familiar with each other's culture. Successful development of this capability contributes to one of the technical areas in which the DARPA CCU program seeks to provide effective dialogue assistance to monolingual operators in cross-cultural interactions.

For more information about the evaluation task, data, metric, protocol, and schedule, refer to the evaluation plan posted on the evaluation page: https://www.nist.gov/itl/iad/mig/computational-cultural-understanding-open-evaluation-openccu.
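To make the task shape concrete, here is a minimal sketch of what a norm-detection system's output might look like: for each detected norm occurrence in a recording, the system reports the norm category, the time span, and whether the speaker adhered to or violated that norm. The norm labels, fields, and pass-through logic below are hypothetical illustrations, not the official taxonomy or submission format; both are defined in the evaluation plan linked above.

    # Hypothetical sketch of norm-detection output; the norm labels,
    # fields, and detection logic are illustrative only. See the
    # OpenCCU Evaluation Plan for the real task and output format.
    from dataclasses import dataclass
    from typing import Dict, List

    # Example norm categories (hypothetical, not the official taxonomy)
    NORM_CATEGORIES = {"greeting", "apology", "request", "criticism"}

    @dataclass
    class NormOccurrence:
        file_id: str       # identifier of the source recording
        norm: str          # detected norm category
        start: float       # start of the span, in seconds
        end: float         # end of the span, in seconds
        status: str        # "adhere" or "violate"
        confidence: float  # system confidence in [0, 1]

    def detect_norms(file_id: str, segments: List[Dict]) -> List[NormOccurrence]:
        """Toy detector: a real system would analyze the audio, video,
        and transcript; here we simply pass through precomputed labels."""
        return [
            NormOccurrence(file_id, s["norm"], s["start"], s["end"],
                           s["status"], s["score"])
            for s in segments
            if s["norm"] in NORM_CATEGORIES
        ]

A system would then serialize these occurrences into whatever submission format the evaluation plan requires.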

Schedule

Milestone | Date
Registration starts; dev data release | February 13, 2024
Registration ends | June 25, 2024
Pilot data release | July 2, 2024
Pilot period | July 9-16, 2024
Pilot full results | at submission time
Pilot annotation release | July 23, 2024
Eval data release | August 27, 2024
Eval period | September 3-10, 2024
Eval partial results | at submission time
Eval full results | September 17, 2024
Eval annotation release | September 24, 2024
Participants' papers due (not peer reviewed) | October 15, 2024
TREC registration | TBD
TREC conference | November 18-22, 2024
Participation
OpenCCU is open to the public worldwide. Anyone may register to participate, apply their technologies to the OpenCCU data, and submit their system results to the OpenCCU evaluation server. Participation is free.

Participation Logistics
  • Register for an evaluation account (if you don’t have one already). This account allows you to make submissions, check the submission status, and receive the results.

  • Sign the participation and data usage agreement. The agreement confirms that you agree to abide by the data usage terms as well as the evaluation rules set forth in the evaluation plan.

  • Once the agreement is approved, you will be contacted with instructions on how to access the data.

  • Follow the instructions in the evaluation plan on how to format and package your submissions.

Registration Instructions
If you already have an account, log in here or at the top of the page. To create an account and register, follow the steps below.
To Create an Account:
1- Click the "Sign up" button at the top of the page.
2- Provide a valid e-mail address and create a password in the “Sign up” form. (After you click “Sign up”, a confirmation email will be sent to the address you entered.)
3- Click “Confirm my account” in the e-mail sent to you. (A sign-in page will display with your email address and password already entered.)
4- Click “Sign in”. (A dashboard for your account will display with Registration Steps.)
5- Complete the steps in the dashboard to finish creating your account.
6- Registration is complete once steps 1-5 are done.
When you are notified by email from NIST that your License Agreement is approved, you can then access the data.
Creating a Team or Joining a Site and Team
When joining OpenCCU, a participant can create a new Site, join an existing Site and create a Team, or join an existing Team. A participant can be a member of multiple teams.
Each participant, site, and team has its own password. The creator of the site or team sets the respective password.
The Data License Agreement
Each site creator is required to agree to the terms of use outlined in the Linguistic Data Consortium (LDC) license agreement in order to get access to the data. Download the license, read and sign the agreement, and upload the signed copy. The LDC will review each license and contact the site creator with instructions on how to obtain the data.
The Dashboard
The dashboard is the personalized page for each participant. To access the dashboard at any time, click "Dashboard" at the top right of the screen. This is where you can make submissions and view results.

Submission Instructions
Each submission must be associated with a site, team, and task.
The submission portals have already been created.
Submit system output for validation checking or scoring by following these steps:
1- Prepare for Submission
  • System output must be in the format described in the OpenCCU Evaluation Plan. (A hypothetical packaging sketch follows these steps.)
2- Go to Dashboard. (Select "Dashboard" on the top right of the page.)
  • In "Submission Management", click the Task that represents the system output.
  • On the “Submissions” page, click “Upload” for each dataset condition. The “New Submission” page displays.
    • Click "Choose File" to select the corresponding submission file.
    • Click "Submit".
    • A submission ID is automatically created for the submission.
    • The “Scoring” button on the “Submissions” page reads “submitted” until the scoring server finishes scoring, at which point it changes to “Done”.
3- View Submission Results
  • To see a Scoring Run Report, click the “Scoring” button once its label changes from “submitted” to “Done”.
  • To see information about the submission, click the “View Submission” button.
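As a purely illustrative aid for step 1, the sketch below bundles a directory of system output files into a single archive before upload. The directory layout, file naming, and archive format are assumptions made for illustration; the authoritative packaging rules are in the OpenCCU Evaluation Plan.

    # Hypothetical packaging sketch. The directory layout, file naming,
    # and .tgz archive format are illustrative assumptions; follow the
    # OpenCCU Evaluation Plan for the actual submission packaging rules.
    import tarfile
    from pathlib import Path

    def package_submission(output_dir: str, archive_name: str) -> Path:
        """Bundle every file in output_dir into a gzipped tar archive
        that can then be uploaded on the dashboard's submission page."""
        archive = Path(archive_name)
        with tarfile.open(archive, "w:gz") as tar:
            for f in sorted(Path(output_dir).iterdir()):
                # Store each file at the archive root, dropping the
                # local directory prefix.
                tar.add(f, arcname=f.name)
        return archive

    # Example (hypothetical names):
    # package_submission("sysout", "mysite_myteam_normdetection.tgz")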
FAQ
Coming soon