Conversational Prompt Engineering
Main Authors | Ein-Dor, Liat; Toledo-Ronen, Orith; Spector, Artem; Gretz, Shai; Dankin, Lena; Halfon, Alon; Katz, Yoav; Slonim, Noam |
---|---|
Format | Journal Article |
Language | English |
Published | 08.08.2024 |
Subjects | Computer Science - Computation and Language |
Online Access | https://arxiv.org/abs/2408.04560 |
Abstract | Prompts are how humans communicate with LLMs. Informative prompts are essential for guiding LLMs to produce the desired output. However, prompt engineering is often tedious and time-consuming and requires significant expertise, limiting its widespread use. We propose Conversational Prompt Engineering (CPE), a user-friendly tool that helps users create personalized prompts for their specific tasks. CPE uses a chat model to briefly interact with users, helping them articulate their output preferences and integrating these into the prompt. The process includes two main stages: first, the model uses user-provided unlabeled data to generate data-driven questions and uses the responses to shape the initial instruction. Then, the model shares the outputs generated by the instruction and uses user feedback to further refine the instruction and the outputs. The final result is a few-shot prompt, where the outputs approved by the user serve as few-shot examples. A user study on summarization tasks demonstrates the value of CPE in creating personalized, high-performing prompts. The results suggest that the zero-shot prompt obtained is comparable to its much longer few-shot counterpart, indicating significant savings in scenarios involving repetitive tasks with large text volumes. |
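The two-stage flow described in the abstract can be sketched in code. This is a minimal illustration under stated assumptions, not the authors' implementation: the chat model is stubbed with a canned function so the example runs standalone, and every name here (`chat_model`, `stage1_build_instruction`, and so on) is a hypothetical placeholder.

```python
# Illustrative sketch of the Conversational Prompt Engineering (CPE) flow:
# stage 1 builds an initial instruction from data-driven questions, stage 2
# refines it from user feedback and collects approved outputs as few-shot
# examples. All names are placeholders; the chat model is a stub.

def chat_model(messages):
    """Stand-in for a real chat model: echoes a canned reply."""
    last = messages[-1]["content"]
    return f"[model reply to: {last[:40]}]"

def stage1_build_instruction(unlabeled_texts, answer_question):
    """Stage 1: derive data-driven questions from unlabeled examples and
    fold the user's answers into an initial task instruction."""
    instruction = "Summarize the input text."
    for text in unlabeled_texts:
        question = chat_model([{
            "role": "user",
            "content": f"What should a summary of this text emphasize?\n{text}",
        }])
        preference = answer_question(question)  # the user's stated preference
        instruction += f" {preference}"
    return instruction

def stage2_refine(instruction, unlabeled_texts, give_feedback):
    """Stage 2: show outputs generated by the instruction, collect feedback,
    and keep user-approved outputs as few-shot examples."""
    approved = []
    for text in unlabeled_texts:
        output = chat_model([
            {"role": "system", "content": instruction},
            {"role": "user", "content": text},
        ])
        verdict = give_feedback(text, output)  # "approve" or a correction
        if verdict == "approve":
            approved.append((text, output))
        else:
            instruction += f" Also: {verdict}"  # fold the correction back in
    return instruction, approved

def build_few_shot_prompt(instruction, approved):
    """Final artifact: the refined instruction plus approved input/output
    pairs as few-shot examples."""
    shots = "\n\n".join(f"Input: {t}\nOutput: {o}" for t, o in approved)
    return f"{instruction}\n\n{shots}\n\nInput: {{new_input}}\nOutput:"
```

In use, `answer_question` and `give_feedback` would be the live chat turns with the user; here they can be any callables, which also makes the loop easy to test with scripted responses.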
Author | Ein-Dor, Liat; Toledo-Ronen, Orith; Spector, Artem; Gretz, Shai; Dankin, Lena; Halfon, Alon; Katz, Yoav; Slonim, Noam |
ContentType | Journal Article |
Copyright | http://creativecommons.org/licenses/by-nc-nd/4.0 |
DOI | 10.48550/arxiv.2408.04560 |
DatabaseName | arXiv Computer Science; arXiv.org |
ExternalDocumentID | 2408_04560 |
IsDoiOpenAccess | true |
IsOpenAccess | true |
IsPeerReviewed | false |
IsScholarly | false |
OpenAccessLink | https://arxiv.org/abs/2408.04560 |
PublicationDate | 2024-08-08 |
PublicationYear | 2024 |
SecondaryResourceType | preprint |
SourceID | arxiv |
SourceType | Open Access Repository |
SubjectTerms | Computer Science - Computation and Language |
Title | Conversational Prompt Engineering |
URI | https://arxiv.org/abs/2408.04560 |
linkProvider | Cornell University |