The Automated Negotiating Agent Competition (ANAC) is an international tournament, held annually since 2010, that brings together researchers from the negotiation community. ANAC provides a unique benchmark for evaluating practical negotiation strategies in multi-issue domains and has the following aims: to provide an incentive for the development of effective and efficient negotiation protocols and strategies for bidding, accepting, and opponent modeling in different negotiation scenarios; to collect and develop a benchmark of negotiation scenarios, protocols, and strategies; to develop a common set of tools and criteria for the evaluation and exploration of new protocols and strategies against benchmark scenarios, protocols, and strategies; and to set the research agenda for automated negotiation.
The previous competitions have spawned novel AI research in autonomous agent design, which is now available to the wider research community.
This year, we introduce five different negotiation research challenges: Agent Negotiation and Elicitation (GeniusWeb framework), Human-Agent Negotiation (IAGO framework), Werewolf Game (AIWolf Framework), Supply Chain Management (NegMas framework), HUMAINE (HUman Multi-Agent Immersive NEgotiation).
We expect innovative and novel agent strategies to be developed, and the submitted ANAC 2020 agents will serve as a repository of negotiating agents for the negotiation community. Researchers can develop novel negotiating agents and evaluate them by comparing their performance against that of the ANAC 2020 agents.
January 15th, 5:00pm-9:20pm (JST)
The competition webpage has been released at http://web.tuat.ac.jp/~katfuji/ANAC2020/
IQ test competition
The automated IQ test competition covers three major categories of IQ test questions: verbal comprehension, diagram reasoning, and sequence reasoning. All questions are collected manually from genuine IQ tests designed for humans.
Participants are required to develop AI programs that solve these problems automatically, given a provided dataset.
Human players are also encouraged to participate in the test.
January 13th, 9:00am-5:00pm JST
The competition webpage has been released at https://www.iqtest.pub/
In this competition, your task is to develop an intelligent Mahjong agent that can compete with other agents as well as human players on the online AI platform Botzone. We adopt the Mahjong Competition Rules (MCR) for this challenge. We provide a sample program for Mahjong beginners, as well as a judge program that can help you learn the MCR rules and debug your AI. The final winner after two formal rounds will be the champion of the competition.
January 12th-13th, 5:00pm-10:00pm (JST)
The competition webpage has been released at https://botzone.org.cn/static/gamecontest2020a.html