Sitemap

A list of all the posts and pages found on the site. For the robots out there, an XML version is available for digesting as well.

Pages

Posts

Future Blog Post

This post will show up by default. To disable scheduling of future posts, edit _config.yml and set future: false.
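In a standard Jekyll setup (which this template follows), the behavior described above corresponds to a single setting in the site's main configuration file, conventionally named `_config.yml`:

```yaml
# _config.yml — when false, posts dated in the future are
# excluded from the generated site at build time
future: false
```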

Blog Post number 4

This is a sample blog post. Lorem ipsum I can’t remember the rest of lorem ipsum and don’t have an internet connection right now. Testing testing testing this blog post. Blog posts are cool.

Blog Post number 3

This is a sample blog post. Lorem ipsum I can’t remember the rest of lorem ipsum and don’t have an internet connection right now. Testing testing testing this blog post. Blog posts are cool.

Blog Post number 2

This is a sample blog post. Lorem ipsum I can’t remember the rest of lorem ipsum and don’t have an internet connection right now. Testing testing testing this blog post. Blog posts are cool.

Blog Post number 1

This is a sample blog post. Lorem ipsum I can’t remember the rest of lorem ipsum and don’t have an internet connection right now. Testing testing testing this blog post. Blog posts are cool.

Portfolio

Publications

Automatic Term Extraction in Technical Domain using Part-of-Speech and Common Word Features

Published in Association for Computing Machinery (ACM) Symposium DocEng, 2018

Extracting key terms from technical documents allows us to write effective documentation that is specific and clear, with minimal ambiguity and confusion caused by nearly synonymous but distinct terms. We present a method for Automatic Term Extraction (ATE) in the technical domain, based on part-of-speech features and common-word information, that achieves results comparable to or better than the reported state of the art.

Recommended citation: "Automatic Term Extraction in Technical Domain using Part-of-Speech and Common Word Features", ACM Symposium DocEng 2018, Nisha Simon and Vlado Keselj, August 2018, Article No.: 51, pp 1–4. https://doi.org/10.1145/3209280.3229100

A Natural Language Model for Generating PDDL

Published in The International Conference on Automated Planning and Scheduling (ICAPS), 2021

The goal of this preliminary work is to predict the next completion in PDDL code, based on previous and surrounding text. The ability to generate PDDL code will be extremely useful to PDDL practitioners for solving planning problems. It further opens the door to providing a source of inspiration for the modeller. The main contribution of our approach is a language model, built using Recurrent Neural Networks (RNNs) and trained on existing PDDL domains, which can be used to generate PDDL-like code.

Recommended citation: "A Natural Language Model for Generating PDDL", Nisha Simon and Christian Muise, Proceedings of The International Conference on Automated Planning and Scheduling (ICAPS) 2021, KEPS Workshop, August 2021, pp 1–8. https://icaps21.icaps-conference.org/workshops/KEPS/Papers/KEPS_2021_paper_7.pdf

TattleTale: Storytelling with Planning and Large Language Models

Published in The International Conference on Automated Planning and Scheduling (ICAPS), 2022

We explore how automated planning can be applied to Natural Language text generation in order to create narratives (stories) that are coherent and believable. Our work represents a key first step towards the novel application of planning technology to a neuro-symbolic approach for effective story generation.

Recommended citation: "TattleTale: Storytelling with Planning and Large Language Models", Proceedings of The International Conference on Automated Planning and Scheduling (ICAPS) 2022, SPARK Workshop, Nisha Simon and Christian Muise, June 2022, pp 1–8. https://icaps22.icaps-conference.org/workshops/SPARK/papers/spark2022_paper_2.pdf

Does Robin Hood Use a Lightsaber?: Automated Planning for Storytelling

Published in Twenty-Seventh AAAI/SIGAI Doctoral Consortium, Association for the Advancement of Artificial Intelligence (AAAI), 2024

While recent work has shown that LLMs can successfully be used for narrative generation, the resulting narratives often lack coherence and can be prone to repetition and stilted language. Automated Planning can therefore be combined with Natural Language text generation to create narratives (stories) that are logical, coherent, and believable.

Recommended citation: "Does Robin Hood Use a Lightsaber?: Automated Planning for Storytelling", Twenty-Seventh AAAI/SIGAI Doctoral Consortium, Association for the Advancement of Artificial Intelligence (AAAI), Nisha Simon, February 2024, pp 1–2. https://doi.org/10.1609/aaai.v38i21.30411

Want To Choose Your Own Adventure? Then First Make a Plan.

Published in The 37th Canadian Conference on Artificial Intelligence (Canadian AI), 2024

We combine the fields of Automated Planning and Text Generation to show how text-based interactive Choose Your Own Adventure (CYOA) stories can be created using LLMs in conjunction with Fully Observable Non-Deterministic (FOND) based Automated Planning.

Recommended citation: "Want To Choose Your Own Adventure? Then First Make a Plan.", The 37th Canadian Conference on Artificial Intelligence (Canadian AI) 2024, Nisha Simon and Christian Muise, May 2024, pp 1–6. https://caiac.pubpub.org/pub/d2ujhb4x
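The core idea above — a nondeterministic action whose possible outcomes become reader choices — can be illustrated with a toy sketch. This is a simplified illustration only, not the paper's system: the outcome table and state strings are invented, and a real FOND planner would compute a policy over such branches rather than enumerate a hand-written table.

```python
# Toy branching-story sketch: a FOND-style action has several possible
# outcomes, and each outcome opens a different story branch (a reader
# choice in a Choose Your Own Adventure setting).

outcomes = {
    "open_door": ["the door creaks open", "the door is locked"],
    "the door creaks open": ["you step inside"],
    "the door is locked": ["you search for a key"],
}

def unfold(state, depth=0):
    """Depth-first expansion of every reachable story branch."""
    lines = ["  " * depth + state]
    for nxt in outcomes.get(state, []):
        lines.extend(unfold(nxt, depth + 1))
    return lines

tree = unfold("open_door")
print("\n".join(tree))
```

Each indentation level corresponds to one choice point; a reader following the story picks one branch at each level.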

Large Language Models are Incoherent Storytellers.

Published in The 37th Canadian Conference on Artificial Intelligence (Canadian AI), 2024

By combining Automated Planning with LLM-based Natural Language text generation, we can create logical, believable, and coherent stories that can be used in a wide variety of domains, for a large range of applications.

Recommended citation: "Large Language Models are Incoherent Storytellers.", The 37th Canadian Conference on Artificial Intelligence (Canadian AI) 2024, Nisha Simon, May 2024, pp 1–5. https://caiac.pubpub.org/pub/3w466klp

Talks

Automatic Term Extraction in Technical Domain using Part-of-Speech and Common Word Features

Extracting key terms from technical documents allows us to write effective documentation that is specific and clear, with minimal ambiguity and confusion caused by nearly synonymous but distinct terms. For instance, to avoid confusion, the same object should not be referred to by two different names (e.g. “hydraulic oil filter”). In the modern world of commerce, clear terminology is the hallmark of successful RFPs (Requests for Proposal) and is therefore key to the growth of competitive organizations. While Automatic Term Extraction (ATE) is a well-developed area of study, its applications in the technical domain have been sparse and constrained to a few narrow areas, such as biomedical research. We present a method for Automatic Term Extraction (ATE) in the technical domain based on part-of-speech features and common-word information.
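A common baseline for POS-based term extraction is to collect noun-phrase-shaped token spans as candidate terms and then filter them. The sketch below is a minimal illustration of that idea under stated assumptions, not the method from the talk: it assumes the input is already POS-tagged, uses a simple (ADJ|NOUN)* NOUN pattern, and the common-word stop list is a hypothetical stand-in for the paper's common-word features.

```python
# Minimal candidate-term extractor over POS-tagged (word, tag) pairs.
COMMON_WORDS = {"thing", "way", "part", "use"}  # hypothetical stop list

def extract_terms(tagged_tokens):
    """Collect maximal (ADJ|NOUN)* spans ending in a NOUN as candidate
    terms, then drop candidates whose head word is a common word."""
    candidates, span = [], []
    for word, tag in tagged_tokens + [("", "END")]:  # sentinel flushes last span
        if tag in ("ADJ", "NOUN"):
            span.append((word, tag))
        else:
            # a valid candidate must end in a NOUN, so trim trailing ADJs
            while span and span[-1][1] != "NOUN":
                span.pop()
            if span:
                candidates.append(" ".join(w for w, _ in span))
            span = []
    return [c for c in candidates if c.split()[-1] not in COMMON_WORDS]

tagged = [("the", "DET"), ("hydraulic", "ADJ"), ("oil", "NOUN"),
          ("filter", "NOUN"), ("is", "VERB"), ("a", "DET"),
          ("useful", "ADJ"), ("thing", "NOUN")]
print(extract_terms(tagged))  # ['hydraulic oil filter']
```

Here "useful thing" is extracted as a candidate but rejected because its head, "thing", is on the common-word list, while "hydraulic oil filter" survives.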

A Natural Language Model for Generating PDDL

The goal of this preliminary work is to predict the next completion in PDDL code, based on previous and surrounding text. Generating valid PDDL code is a key component in creating robust planners. Thus, the ability to generate PDDL code will be extremely useful to PDDL practitioners for the purpose of solving planning problems. It further opens the door to providing a source of inspiration for the modeller. The main contribution of our approach is a language model built using Recurrent Neural Networks (RNNs) that is trained on existing PDDL domains, which can be used to generate PDDL-like code.
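The underlying task — predict the next token given preceding PDDL text — can be shown with a toy stand-in. The paper's model is an RNN; the sketch below substitutes a bigram count model purely to illustrate the prediction task, and the tiny "corpus" is an invented PDDL-like fragment, not the paper's training data.

```python
# Toy next-token predictor over PDDL-like tokens (bigram counts used
# as an illustrative stand-in for the paper's RNN language model).
from collections import Counter, defaultdict

corpus = "( define ( domain blocks ) ( :requirements :strips ) )".split()

# Count which token follows which in the training text.
counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    counts[prev][nxt] += 1

def predict_next(token):
    """Return the most frequent token observed after `token`."""
    if token not in counts:
        return None
    return counts[token].most_common(1)[0][0]

print(predict_next("domain"))    # "blocks" in this tiny corpus
```

An RNN replaces the bigram table with a learned hidden state, so the prediction can depend on the whole preceding context rather than only the previous token.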

Does Robin Hood Use a Lightsaber?: Automated Planning for Storytelling

Automated Planning can be combined with Natural Language text generation to create narratives (stories) that are logical, coherent, and believable. A planning model provides scaffolding for an LLM so that its language generation is context-dependent, allowing users to create more coherent, logical, and believable stories in a variety of domains.
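The scaffolding idea above can be sketched as a pipeline: each plan action constrains what the text generator is asked to produce for that step. In this sketch the plan is hand-written and `generate_sentence` is a placeholder standing in for an LLM call; neither is the system from the talk.

```python
# Plan-scaffolded generation sketch: the symbolic plan fixes the story's
# event structure, and a generator verbalizes one step at a time.

plan = [
    ("travel", "Robin Hood", "Sherwood Forest"),
    ("meet", "Robin Hood", "Little John"),
    ("duel", "Robin Hood", "Little John"),
]

def generate_sentence(action, subject, obj):
    # Placeholder: a real system would prompt an LLM with this
    # (action, subject, object) context instead of using a template.
    return f"{subject} performs '{action}' involving {obj}."

story = " ".join(generate_sentence(*step) for step in plan)
print(story)
```

Because the plan fixes the order and participants of each event, the generator cannot drift into contradictory or repetitive continuations, which is the coherence benefit the abstract describes.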

Teaching

Teaching experience 1

Undergraduate course, University 1, Department, 2014

This is a description of a teaching experience. You can use markdown like any other post.

Teaching experience 2

Workshop, University 1, Department, 2015

This is a description of a teaching experience. You can use markdown like any other post.