    GenTUS: Simulating User Behaviour and Language in Task-oriented Dialogues with Generative Transformers

GenTUS is a user simulator for task-oriented dialogues. It consists of an encoder-decoder structure that optimises the user policy and natural language generation jointly: it generates both semantic actions and natural language utterances, preserving interpretability and enhancing language variation. In addition, by representing the inputs and outputs as word sequences and by using a large pre-trained language model, GenTUS achieves generalisability in feature representation.
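As a rough illustration of this word-sequence interface, the sketch below flattens a structured input (system action, user goal, dialogue history) into a single sequence and parses a generated output sequence back into a semantic action list plus an utterance. The JSON field names and the example values here are illustrative assumptions, not the project's actual format; the real serialisation is defined in `convlab2/policy/genTUS/build_data.py`.

```python
import json

def build_input_sequence(system_action, user_goal, history):
    """Flatten the structured dialogue context into one word sequence.

    Field names and ordering are illustrative assumptions; see
    convlab2/policy/genTUS/build_data.py for the real format.
    """
    return json.dumps({
        "system": system_action,
        "goal": user_goal,
        "history": history,
    })

def parse_output_sequence(output):
    """Split a generated sequence back into (semantic actions, utterance)."""
    parsed = json.loads(output)
    return parsed["action"], parsed["text"]

# Hypothetical turn: the system requests an area, the simulated user informs it.
seq = build_input_sequence(
    system_action=[["request", "restaurant", "area", "?"]],
    user_goal=[["inform", "restaurant", "area", "centre"]],
    history=["what area would you like?"],
)
action, text = parse_output_sequence(
    '{"action": [["inform", "restaurant", "area", "centre"]], '
    '"text": "I am looking for a restaurant in the centre."}'
)
```

Because both the action and the utterance live in one generated sequence, a single decoder produces them in one pass, which is what allows the policy and NLG to be trained with one objective.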

    Training

Building the dataset

    python3 convlab2/policy/genTUS/build_data.py --add-history

    Train the model

    python3 convlab2/policy/genTUS/train_model.py --batch-size 8
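Since the semantic action tokens and the utterance tokens are emitted as one output sequence, a single sequence-to-sequence loss optimises the user policy and language generation together. A toy version of that objective (token-level negative log-likelihood over per-step logits, written in pure Python for illustration rather than taken from `train_model.py`):

```python
import math

def joint_nll(logits, target_ids):
    """Negative log-likelihood of a target token sequence under per-step
    logits -- the standard seq2seq training objective. Because GenTUS emits
    action tokens and utterance tokens in one sequence, this single loss
    covers both the user policy and NLG.
    """
    total = 0.0
    for step_logits, tgt in zip(logits, target_ids):
        # softmax normaliser over the vocabulary at this step
        z = sum(math.exp(l) for l in step_logits)
        total -= math.log(math.exp(step_logits[tgt]) / z)
    return total
```

With a confident correct prediction the loss is near zero; with uniform logits over two tokens it is ln 2 per step.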

    Citing

    If you use GenTUS in your research, please cite:

    @inproceedings{lin-etal-2022-gentus,
        title = "{G}en{TUS}: Simulating User Behaviour and Language in Task-oriented Dialogues with Generative Transformers",
        author = "Lin, Hsien-chin  and
          Geishauser, Christian  and
          Feng, Shutong  and
          Lubis, Nurul  and
          van Niekerk, Carel  and
          Heck, Michael  and
          Gasic, Milica",
        booktitle = "Proceedings of the 23rd Annual Meeting of the Special Interest Group on Discourse and Dialogue",
        month = sep,
        year = "2022",
        address = "Edinburgh, UK",
        publisher = "Association for Computational Linguistics",
        url = "https://aclanthology.org/2022.sigdial-1.28",
        pages = "270--282",
        abstract = "User simulators (USs) are commonly used to train task-oriented dialogue systems via reinforcement learning. The interactions often take place on semantic level for efficiency, but there is still a gap from semantic actions to natural language, which causes a mismatch between training and deployment environment. Incorporating a natural language generation (NLG) module with USs during training can partly deal with this problem. However, since the policy and NLG of USs are optimised separately, these simulated user utterances may not be natural enough in a given context. In this work, we propose a generative transformer-based user simulator (GenTUS). GenTUS consists of an encoder-decoder structure, which means it can optimise both the user policy and natural language generation jointly. GenTUS generates both semantic actions and natural language utterances, preserving interpretability and enhancing language variation. In addition, by representing the inputs and outputs as word sequences and by using a large pre-trained language model we can achieve generalisability in feature representation. We evaluate GenTUS with automatic metrics and human evaluation. Our results show that GenTUS generates more natural language and is able to transfer to an unseen ontology in a zero-shot fashion. In addition, its behaviour can be further shaped with reinforcement learning opening the door to training specialised user simulators.",
    }
    
@inproceedings{zhu2020convlab2,
    title = "{C}onv{L}ab-2: An Open-Source Toolkit for Building, Evaluating, and Diagnosing Dialogue Systems",
    author = "Zhu, Qi  and
      Zhang, Zheng  and
      Fang, Yan  and
      Li, Xiang  and
      Takanobu, Ryuichi  and
      Li, Jinchao  and
      Peng, Baolin  and
      Gao, Jianfeng  and
      Zhu, Xiaoyan  and
      Huang, Minlie",
    booktitle = "Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics",
    year = "2020",
}
    
@inproceedings{liu2021robustness,
    title = "Robustness Testing of Language Understanding in Task-Oriented Dialog",
    author = "Liu, Jiexi  and
      Takanobu, Ryuichi  and
      Wen, Jiaxin  and
      Wan, Dazhen  and
      Li, Hongguang  and
      Nie, Weiran  and
      Li, Cheng  and
      Peng, Wei  and
      Huang, Minlie",
    booktitle = "Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics",
    year = "2021",
}

    License

    Apache License 2.0