Tensor Product Generation Networks for Deep NLP Modeling
Speaker: Dapeng Wu
Professor at the University of Florida, USA; IEEE Fellow; Editor-in-Chief of the IEEE Transactions on Network Science and Engineering
In this talk, I present a new approach to the design of deep networks for natural language processing (NLP), based on the general technique of Tensor Product Representations (TPRs) for encoding and processing symbol structures in distributed neural networks. A network architecture, the Tensor Product Generation Network (TPGN), is proposed that is capable in principle of carrying out TPR computation but uses unconstrained deep learning to design its internal representations. Instantiated in a model for image-caption generation, TPGN outperforms LSTM baselines when evaluated on the COCO dataset. The TPR-capable structure enables interpretation of internal representations and operations, which prove to contain considerable grammatical content. Our caption-generation model can be interpreted as generating sequences of grammatical categories and retrieving words by their categories from a plan encoded as a distributed representation.
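The core TPR idea behind the talk is that a symbol structure can be encoded as a sum of outer products of "filler" vectors (symbols) with "role" vectors (positions), and a filler can later be recovered by unbinding with its role. The following is a minimal toy sketch of that binding/unbinding mechanism in plain Python; the vectors and helper functions are hypothetical illustrations, not the speaker's actual model.

```python
# Toy Tensor Product Representation (TPR): bind fillers (words) to roles
# (positions) via outer products, sum them, then recover fillers by
# unbinding with orthonormal role vectors. Hypothetical illustration only.

def outer(u, v):
    # Outer product u ⊗ v as a nested list (matrix).
    return [[ui * vj for vj in v] for ui in u]

def mat_add(a, b):
    # Elementwise sum of two matrices of equal shape.
    return [[x + y for x, y in zip(ra, rb)] for ra, rb in zip(a, b)]

def unbind(t, role):
    # Multiply the TPR matrix by a role vector to recover its filler.
    return [sum(t[i][j] * role[j] for j in range(len(role)))
            for i in range(len(t))]

# Orthonormal role vectors for sequence positions 0 and 1.
r0, r1 = [1.0, 0.0], [0.0, 1.0]
# Filler vectors standing in for the words "a" and "cat".
f_a, f_cat = [1.0, 2.0], [3.0, 4.0]

# TPR of the sequence ("a", "cat"): T = f_a ⊗ r0 + f_cat ⊗ r1
T = mat_add(outer(f_a, r0), outer(f_cat, r1))

print(unbind(T, r0))  # recovers f_a: [1.0, 2.0]
print(unbind(T, r1))  # recovers f_cat: [3.0, 4.0]
```

Because the roles are orthonormal, unbinding exactly recovers each filler; TPGN replaces these hand-built vectors with representations learned end to end.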
Time: 10:00 AM, August 13, 2018    Venue: Conference Room 1213, Science & Technology Building West, West Campus
Registration deadline: August 13, 2018    Available seats: 50