The T5 model is trained on various datasets for 18 different tasks, which fall into 8 categories.
Task Name | Explanation |
---|---|
1. CoLA | Classify whether a sentence is grammatically correct |
2. RTE | Classify whether a statement can be inferred (entailed) from a sentence |
3. MNLI | Classify for a hypothesis and premise whether they entail or contradict each other, or neither (3 classes). |
4. MRPC | Classify whether a pair of sentences are re-phrasings of each other (semantically equivalent) |
5. QNLI | Classify whether the answer to a question can be inferred from a candidate sentence. |
6. QQP | Classify whether a pair of questions are re-phrasings of each other (semantically equivalent) |
7. SST2 | Classify the sentiment of a sentence as positive or negative |
8. STSB | Rate the semantic similarity of two sentences on a scale from 0 to 5 (21 classes) |
9. CB | Classify for a premise and a hypothesis whether they contradict each other or not (binary). |
10. COPA | Classify for a question, premise, and 2 choices which choice is the correct one (binary). |
11. MultiRc | Classify for a question, a paragraph of text, and an answer candidate, whether the answer is correct (binary). |
12. WiC | Classify for a pair of sentences and an ambiguous word whether the word has the same meaning in both sentences. |
13. WSC/DPR | Predict for an ambiguous pronoun in a sentence what it is referring to. |
14. Summarization | Summarize text into a shorter representation. |
15. SQuAD | Answer a question for a given context. |
16. WMT1 | Translate English to German |
17. WMT2 | Translate English to French |
18. WMT3 | Translate English to Romanian |
Some tasks work fine without any additional pre-processing; only setting the task parameter on the T5 model is required.

Other tasks require exactly 1 additional tag added by manual pre-processing: set the task parameter and then join the sentences on the tag.

The remaining tasks require more than 1 additional tag added by manual pre-processing: set the task parameter and then prefix each sentence with its corresponding tag before joining them.

The task WSC/DPR requires highlighting a pronoun by surrounding it with * symbols and configuring a task parameter.
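The tagging and joining described above is plain string assembly. As an illustrative sketch (the helper name is my own, not an NLU API), the manual pre-processing for multi-tag tasks could look like this:

```python
# Hypothetical helper for the manual pre-processing described above:
# each part is prefixed with its tag, then the parts are joined into one string.
def build_t5_input(tagged_parts):
    """tagged_parts: list of (tag, text) pairs, e.g. [('sentence1:', '...'), ...]."""
    return ' '.join(f"{tag} {text}" for tag, text in tagged_parts)

example = build_t5_input([
    ('sentence1:', 'Peter loves New York, it is his favorite city'),
    ('sentence2:', 'Peter loves new York.'),
])
# example == 'sentence1: Peter loves New York, it is his favorite city sentence2: Peter loves new York.'
```

The same pattern covers all of the multi-tag tasks below; only the tags change.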
The following sections describe each task in detail, with an example and also a pre-processed example.
*NOTE:* Line breaks are added to the pre-processed examples in the following sections. The T5 model also works with line breaks, but they can hinder performance, so intentionally adding them is not recommended.
Judges if a sentence is grammatically acceptable.
This is a sub-task of GLUE.
sentence | prediction |
---|---|
Anna and Mike is going skiing and they is liked is | unacceptable |
Anna and Mike like to dance | acceptable |
Set the `.setTask('cola sentence:')` prefix.
cola
sentence: Anna and Mike is going skiing and they is liked is
The RTE task is defined as recognizing, given two text fragments, whether the meaning of one text can be inferred (entailed) from the other or not.
Classification of sentence pairs as entailment and not_entailment.
This is a sub-task of GLUE and SuperGLUE.
sentence 1 | sentence 2 | prediction |
---|---|---|
Kessler ’s team conducted 60,643 interviews with adults in 14 countries. | Kessler ’s team interviewed more than 60,000 adults in 14 countries | entailment |
Peter loves New York, it is his favorite city | Peter loves new York. | entailment |
Recent report say Johnny makes he alot of money, he earned 10 million USD each year for the last 5 years. | Johnny is a millionare | entailment |
Recent report say Johnny makes he alot of money, he earned 10 million USD each year for the last 5 years. | Johnny is a poor man | not_entailment |
It was raining in England for the last 4 weeks | England was very dry yesterday | not_entailment |
Set the task with `.setTask('rte sentence1:')` and prefix the second sentence with `sentence2:`.
rte
sentence1: Recent report say Peter makes he alot of money, he earned 10 million USD each year for the last 5 years.
sentence2: Peter is a millionare.
Classification of sentence pairs with the labels entailment, contradiction, and neutral.
This is a sub-task of GLUE.
This classifier predicts, for a hypothesis and a premise, which of these relations holds:
Hypothesis | Premise | prediction |
---|---|---|
Recent report say Johnny makes he alot of money, he earned 10 million USD each year for the last 5 years. | Johnny is a poor man. | contradiction |
It rained in England the last 4 weeks. | It was snowing in New York last week | neutral |
Set the task with `.setTask('mnli hypothesis:')` and prefix the second sentence with `premise:`.
mnli
hypothesis: At 8:34, the Boston Center controller received a third, transmission from American 11.
premise: The Boston Center controller got a third transmission from American 11.
Detect whether one sentence is a re-phrasing of, or similar to, another sentence.
This is a sub-task of GLUE.
Sentence1 | Sentence2 | prediction |
---|---|---|
We acted because we saw the existing evidence in a new light , through the prism of our experience on 11 September , " Rumsfeld said . | Rather , the US acted because the administration saw " existing evidence in a new light , through the prism of our experience on September 11 " . | equivalent |
I like to eat peanutbutter for breakfast | I like to play football | not_equivalent |
Set the task with `.setTask('mrpc sentence1:')` and prefix the second sentence with `sentence2:`.
mrpc
sentence1: We acted because we saw the existing evidence in a new light , through the prism of our experience on 11 September , " Rumsfeld said .
sentence2: Rather , the US acted because the administration saw " existing evidence in a new light , through the prism of our experience on September 11",
ISSUE: Only neutral and contradiction are returned as predictions for the tested samples, never entailment.
Classify whether a question is answered by a sentence (entailment).
This is a sub-task of GLUE.
Question | Answer | prediction |
---|---|---|
Where did Jebe die? | Ghenkis Khan recalled Subtai back to Mongolia soon afterward, and Jebe died on the road back to Samarkand | entailment |
What does Steve like to eat? | Steve watches TV all day | not_entailment |
Set the task with `.setTask('qnli')`, then prefix the question with `question:` and the sentence with `sentence:`.
qnli
question: Where did Jebe die?
sentence: Ghenkis Khan recalled Subtai back to Mongolia soon afterwards, and Jebe died on the road back to Samarkand,
Based on a quora dataset, determine whether a pair of questions are semantically equivalent.
This is a sub-task of GLUE.
Question1 | Question2 | prediction |
---|---|---|
What attributes would have made you highly desirable in ancient Rome? | How I GET OPPERTINUTY TO JOIN IT COMPANY AS A FRESHER? | not_duplicate |
What was it like in Ancient rome? | What was Ancient rome like? | duplicate |
Set the task with `.setTask('qqp question1:')` and prefix the second question with `question2:`.
qqp
question1: What attributes would have made you highly desirable in ancient Rome?
question2: How I GET OPPERTINUTY TO JOIN IT COMPANY AS A FRESHER?',
Binary sentiment classification.
This is a sub-task of GLUE.
Sentence1 | Prediction |
---|---|
it confirms fincher ’s status as a film maker who artfully bends technical know-how to the service of psychological insight | positive |
I really hated that movie | negative |
.setTask('sst2 sentence: ')
sst2
sentence: I hated that movie
Measures how similar two sentences are on a scale from 0 to 5, with 21 classes representing a regression-style label.
This is a sub-task of GLUE.
Question1 | Question2 | prediction |
---|---|---|
What attributes would have made you highly desirable in ancient Rome? | How I GET OPPERTINUTY TO JOIN IT COMPANY AS A FRESHER? | 0 |
What was it like in Ancient rome? | What was Ancient rome like? | 5.0 |
What was live like as a King in Ancient Rome?? | What is it like to live in Rome? | 3.2 |
Set the task with `.setTask('stsb sentence1:')` and prefix the second sentence with `sentence2:`.
stsb
sentence1: What attributes would have made you highly desirable in ancient Rome?
sentence2: How I GET OPPERTINUTY TO JOIN IT COMPANY AS A FRESHER?',
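To treat this regression task as classification, the T5 authors report rounding STS-B scores to the nearest increment of 0.2. A minimal sketch of that rounding (the helper name is hypothetical, not part of NLU):

```python
# Snap a raw similarity score onto the discretized STS-B grid
# (assumes the T5 paper's scheme: clip to [0, 5], round to the nearest 0.2).
def to_stsb_class(score):
    clipped = min(max(score, 0.0), 5.0)          # keep the score inside [0, 5]
    return round(round(clipped / 0.2) * 0.2, 1)  # nearest multiple of 0.2

to_stsb_class(3.19)  # -> 3.2
```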
Classify whether a premise contradicts a hypothesis.
Predicts entailment, neutral, and contradiction.
This is a sub-task of SuperGLUE.
Hypothesis | Premise | Prediction |
---|---|---|
Valence was helping | Valence the void-brain, Valence the virtuous valet. Why couldn’t the figger choose his own portion of titanic anatomy to shaft? Did he think he was helping' | Contradiction |
Set the task with `.setTask('cb hypothesis:')` and prefix the premise with `premise:`.
cb
hypothesis: Valence was helping
premise: Valence the void-brain, Valence the virtuous valet. Why couldn’t the figger choose his own portion of titanic anatomy to shaft? Did he think he was helping,
The Choice of Plausible Alternatives (COPA) task by Roemmele et al. (2011) evaluates causal reasoning between events, which requires commonsense knowledge about what usually takes place in the world. Each example provides a premise and either asks for the correct cause or effect from two choices, thus testing either backward or forward causal reasoning. COPA data, which consists of 1,000 examples total, can be downloaded at https://people.ict.usc.e
This is a sub-task of SuperGLUE.
This classifier selects which of 2 choices is the correct one, based on a premise and a question.
Premise: The man lost his balance on the ladder.
question: What happened as a result?
Alternative 1: He fell off the ladder.
Alternative 2: He climbed up the ladder.
Premise: The man fell unconscious. What was the cause
of this?
Alternative 1: The assailant struck the man in the head.
Alternative 2: The assailant took the man’s wallet.
Question | Premise | Choice 1 | Choice 2 | Prediction |
---|---|---|---|---|
effect | Political violence broke out in the nation. | Many citizens relocated to the capitol. | Many citizens took refuge in other territories | Choice 1 |
cause | The man fell unconscious | The assailant struck the man in the head | The assailant took the man's wallet. | Choice 1 |
Set the task with `.setTask('copa choice1:')`, prefix choice 2 with `choice2:`, the premise with `premise:`, and the question with `question:`.
copa
choice1: He fell off the ladder
choice2: He climbed up the ladder
premise: The man lost his balance on the ladder
question: effect
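The COPA pre-processing above is just string assembly. As an illustration (the helper name is my own, not an NLU API), the full tagged input could be built like this; when the task prefix is set via `setTask`, only the part after it needs to be in the data:

```python
# Hypothetical helper assembling the tagged COPA input described above.
def build_copa_input(choice1, choice2, premise, question):
    return (f"copa choice1: {choice1} choice2: {choice2} "
            f"premise: {premise} question: {question}")

text = build_copa_input(
    "He fell off the ladder", "He climbed up the ladder",
    "The man lost his balance on the ladder", "effect",
)
```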
Evaluates an answer for a question as true or false, based on an input paragraph. The T5 model predicts, for a question and a paragraph of sentences, whether an answer is true or not, based on the semantic contents of the paragraph.
This is a sub-task of SuperGLUE.
The T5 model exceeds human performance on this task by a large margin.
Question | Answer | Prediction | paragraph |
---|---|---|---|
Why was Joey surprised the morning he woke up for breakfast? | There was only pie to eat, rather than traditional breakfast foods | True | Once upon a time, there was a squirrel named Joey. Joey loved to go outside and play with his cousin Jimmy. Joey and Jimmy played silly games together, and were always laughing. One day, Joey and Jimmy went swimming together 50 at their Aunt Julie’s pond. Joey woke up early in the morning to eat some food before they left. He couldn’t find anything to eat except for pie! Usually, Joey would eat cereal, fruit (a pear), or oatmeal for breakfast. After he ate, he and Jimmy went to the pond. On their way there they saw their friend Jack Rabbit. They dove into the water and swam for several hours. The sun was out, but the breeze was cold. Joey and Jimmy got out of the water and started walking home. Their fur was wet, and the breeze chilled them. When they got home, they dried off, and Jimmy put on his favorite purple shirt. Joey put on a blue shirt with red and green dots. The two squirrels ate some food that Joey’s mom, Jasmine, made and went off to bed., |
Why was Joey surprised the morning he woke up for breakfast? | There was a T-Rex in his garden | False | Once upon a time, there was a squirrel named Joey. Joey loved to go outside and play with his cousin Jimmy. Joey and Jimmy played silly games together, and were always laughing. One day, Joey and Jimmy went swimming together 50 at their Aunt Julie’s pond. Joey woke up early in the morning to eat some food before they left. He couldn’t find anything to eat except for pie! Usually, Joey would eat cereal, fruit (a pear), or oatmeal for breakfast. After he ate, he and Jimmy went to the pond. On their way there they saw their friend Jack Rabbit. They dove into the water and swam for several hours. The sun was out, but the breeze was cold. Joey and Jimmy got out of the water and started walking home. Their fur was wet, and the breeze chilled them. When they got home, they dried off, and Jimmy put on his favorite purple shirt. Joey put on a blue shirt with red and green dots. The two squirrels ate some food that Joey’s mom, Jasmine, made and went off to bed., |
Set the task with `.setTask('multirc questions:')`, followed by the question. Add the `answer:` prefix for the answer to evaluate, followed by `paragraph:` and then a series of sentences, where each sentence is prefixed with `Sent n:`.
multirc questions: Why was Joey surprised the morning he woke up for breakfast?
answer: There was a T-REX in his garden.
paragraph:
Sent 1: Once upon a time, there was a squirrel named Joey.
Sent 2: Joey loved to go outside and play with his cousin Jimmy.
Sent 3: Joey and Jimmy played silly games together, and were always laughing.
Sent 4: One day, Joey and Jimmy went swimming together 50 at their Aunt Julie’s pond.
Sent 5: Joey woke up early in the morning to eat some food before they left.
Sent 6: He couldn’t find anything to eat except for pie!
Sent 7: Usually, Joey would eat cereal, fruit (a pear), or oatmeal for breakfast.
Sent 8: After he ate, he and Jimmy went to the pond.
Sent 9: On their way there they saw their friend Jack Rabbit.
Sent 10: They dove into the water and swam for several hours.
Sent 11: The sun was out, but the breeze was cold.
Sent 12: Joey and Jimmy got out of the water and started walking home.
Sent 13: Their fur was wet, and the breeze chilled them.
Sent 14: When they got home, they dried off, and Jimmy put on his favorite purple shirt.
Sent 15: Joey put on a blue shirt with red and green dots.
Sent 16: The two squirrels ate some food that Joey’s mom, Jasmine, made and went off to bed.
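The `Sent n:` numbering lends itself to a small loop. A sketch of the pre-processing (helper names are my own, not an NLU API):

```python
# Hypothetical helpers for the MultiRC pre-processing described above.
def build_multirc_paragraph(sentences):
    # Prefix each sentence with 'Sent n:' (1-based numbering).
    return '\n'.join(f"Sent {i}: {s}" for i, s in enumerate(sentences, start=1))

def build_multirc_input(question, answer, sentences):
    # Assemble question, answer candidate, and the numbered paragraph.
    return (f"multirc questions: {question}\n"
            f"answer: {answer}\n"
            f"paragraph:\n{build_multirc_paragraph(sentences)}")
```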
Decide for two sentences with a shared ambiguous word whether the target word has the same semantic meaning in both sentences.
This is a sub-task of SuperGLUE.
Predicted | ambiguous word | Sentence 1 | Sentence 2 |
---|---|---|---|
False | kill | He totally killed that rock show! | The airplane crash killed his family |
True | window | The expanded window will give us time to catch the thieves. | You have a two-hour window for turning in your homework. |
False | window | He jumped out of the window. | You have a two-hour window for turning in your homework. |
Set the task with `.setTask('wic pos:')`, followed by `sentence1:` as prefix for the first sentence and `sentence2:` as prefix for the second sentence.
wic pos:
sentence1: The expanded window will give us time to catch the thieves.
sentence2: You have a two-hour window of turning in your homework.
word : window
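Following the pre-processed example above, the WiC input can be assembled with a small sketch (the helper name is hypothetical, not an NLU API):

```python
# Hypothetical helper mirroring the WiC pre-processed example above.
def build_wic_input(word, sentence1, sentence2):
    return (f"wic pos:\n"
            f"sentence1: {sentence1}\n"
            f"sentence2: {sentence2}\n"
            f"word : {word}")
```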
Predict for an ambiguous pronoun which noun it refers to.
This is a sub-task of GLUE and SuperGLUE.
Prediction | Text |
---|---|
stable | The stable was very roomy, with four good stalls; a large swinging window opened into the yard , which made it pleasant and airy. |
Set the task with `.setTask('wsc:')` and surround the ambiguous pronoun with asterisk (*) symbols.
*Note:* Read Appendix A for more info.
wsc:
The stable was very roomy, with four good stalls; a large swinging window opened into the yard , which made *it* pleasant and airy.
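The highlighting step is a simple string substitution. A naive sketch (the helper name is my own, and it assumes the pronoun occurs surrounded by spaces):

```python
# Hypothetical helper: surround the first occurrence of the ambiguous
# pronoun with * symbols, as WSC/DPR pre-processing requires.
# Naive: assumes the pronoun appears with a space on each side.
def highlight_pronoun(sentence, pronoun):
    return sentence.replace(f" {pronoun} ", f" *{pronoun}* ", 1)

highlight_pronoun("which made it pleasant and airy.", "it")
# -> 'which made *it* pleasant and airy.'
```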
Summarizes a paragraph into a shorter version with the same semantic meaning.
Predicted summary | Text |
---|---|
manchester united face newcastle in the premier league on wednesday . louis van gaal's side currently sit two points clear of liverpool in fourth . the belgian duo took to the dance floor on monday night with some friends . | the belgian duo took to the dance floor on monday night with some friends . manchester united face newcastle in the premier league on wednesday . red devils will be looking for just their second league away win in seven . louis van gaal’s side currently sit two points clear of liverpool in fourth . |
Set the task with `.setTask('summarize:')`. This task requires no further pre-processing; setting the task to `summarize:` is sufficient.
the belgian duo took to the dance floor on monday night with some friends . manchester united face newcastle in the premier league on wednesday . red devils will be looking for just their second league away win in seven . louis van gaal’s side currently sit two points clear of liverpool in fourth .
Predict an answer to a question based on input context.
Predicted Answer | Question | Context |
---|---|---|
carbon monoxide | What does increased oxygen concentrations in the patient’s lungs displace? | Hyperbaric (high-pressure) medicine uses special oxygen chambers to increase the partial pressure of O 2 around the patient and, when needed, the medical staff. Carbon monoxide poisoning, gas gangrene, and decompression sickness (the ’bends’) are sometimes treated using these devices. Increased O 2 concentration in the lungs helps to displace carbon monoxide from the heme group of hemoglobin. Oxygen gas is poisonous to the anaerobic bacteria that cause gas gangrene, so increasing its partial pressure helps kill them. Decompression sickness occurs in divers who decompress too quickly after a dive, resulting in bubbles of inert gas, mostly nitrogen and helium, forming in their blood. Increasing the pressure of O 2 as soon as possible is part of the treatment. |
pie | What did Joey eat for breakfast? | Once upon a time, there was a squirrel named Joey. Joey loved to go outside and play with his cousin Jimmy. Joey and Jimmy played silly games together, and were always laughing. One day, Joey and Jimmy went swimming together 50 at their Aunt Julie’s pond. Joey woke up early in the morning to eat some food before they left. Usually, Joey would eat cereal, fruit (a pear), or oatmeal for breakfast. After he ate, he and Jimmy went to the pond. On their way there they saw their friend Jack Rabbit. They dove into the water and swam for several hours. The sun was out, but the breeze was cold. Joey and Jimmy got out of the water and started walking home. Their fur was wet, and the breeze chilled them. When they got home, they dried off, and Jimmy put on his favorite purple shirt. Joey put on a blue shirt with red and green dots. The two squirrels ate some food that Joey’s mom, Jasmine, made and went off to bed,' |
Set the task with `.setTask('question:')` and prefix the context, which can consist of multiple sentences, with `context:`.
question: What does increased oxygen concentrations in the patient’s lungs displace?
context: Hyperbaric (high-pressure) medicine uses special oxygen chambers to increase the partial pressure of O 2 around the patient and, when needed, the medical staff. Carbon monoxide poisoning, gas gangrene, and decompression sickness (the ’bends’) are sometimes treated using these devices. Increased O 2 concentration in the lungs helps to displace carbon monoxide from the heme group of hemoglobin. Oxygen gas is poisonous to the anaerobic bacteria that cause gas gangrene, so increasing its partial pressure helps kill them. Decompression sickness occurs in divers who decompress too quickly after a dive, resulting in bubbles of inert gas, mostly nitrogen and helium, forming in their blood. Increasing the pressure of O 2 as soon as possible is part of the treatment.
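As with the other tasks, the SQuAD input is built by tagging and joining; a minimal sketch (the helper name is hypothetical, not an NLU API):

```python
# Hypothetical helper for the SQuAD-style pre-processing described above.
def build_squad_input(question, context):
    return f"question: {question} context: {context}"

text = build_squad_input(
    "What does increased oxygen concentrations in the patient’s lungs displace?",
    "Hyperbaric (high-pressure) medicine uses special oxygen chambers ...",
)
```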
For translation tasks, using the Marian model is recommended.
.setTask('translate English to German:')
For translation tasks, using the Marian model is recommended.
.setTask('translate English to French:')
For translation tasks, using the Marian model is recommended.
.setTask('translate English to Romanian:')
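The three translation tasks differ only in their task string; a small illustrative mapping (the names here are my own, only the task strings come from the examples above):

```python
# The translation task strings listed above, keyed by target language.
TRANSLATION_TASKS = {
    'German':   'translate English to German:',
    'French':   'translate English to French:',
    'Romanian': 'translate English to Romanian:',
}

def translation_task(target_language):
    # Look up the setTask string for the given target language.
    return TRANSLATION_TASKS[target_language]
```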
import os
! apt-get update -qq > /dev/null
# Install java
! apt-get install -y openjdk-8-jdk-headless -qq > /dev/null
os.environ["JAVA_HOME"] = "/usr/lib/jvm/java-8-openjdk-amd64"
os.environ["PATH"] = os.environ["JAVA_HOME"] + "/bin:" + os.environ["PATH"]
! pip install nlu pyspark==2.4.7
import nlu
t5 = nlu.load('en.t5.base')
t5_base download started this may take some time. Approximate size to download 446 MB [OK!]
t5.print_info()
The following parameters are configurable for this NLU pipeline (You can copy paste the examples) :
>>> pipe['t5'] has settable params:
pipe['t5'].setMaxOutputLength(200) | Info: Set the maximum length of output text | Currently set to : 200
pipe['t5'].setTask('base') | Info: Transformer's task, e.g. summarize> | Currently set to : base
>>> pipe['sentence_detector'] has settable params:
pipe['sentence_detector'].setUseAbbreviations(True) | Info: whether to apply abbreviations at sentence detection | Currently set to : True
pipe['sentence_detector'].setDetectLists(True) | Info: whether detect lists during sentence detection | Currently set to : True
pipe['sentence_detector'].setUseCustomBoundsOnly(False) | Info: Only utilize custom bounds in sentence detection | Currently set to : False
pipe['sentence_detector'].setCustomBounds([]) | Info: characters used to explicitly mark sentence bounds | Currently set to : []
pipe['sentence_detector'].setExplodeSentences(False) | Info: whether to explode each sentence into a different row, for better parallelization. Defaults to false. | Currently set to : False
pipe['sentence_detector'].setMinLength(0) | Info: Set the minimum allowed length for each sentence. | Currently set to : 0
pipe['sentence_detector'].setMaxLength(99999) | Info: Set the maximum allowed length for each sentence | Currently set to : 99999
>>> pipe['default_tokenizer'] has settable params:
pipe['default_tokenizer'].setTargetPattern('\S+') | Info: pattern to grab from text as token candidates. Defaults \S+ | Currently set to : \S+
pipe['default_tokenizer'].setContextChars(['.', ',', ';', ':', '!', '?', '*', '-', '(', ')', '"', "'"]) | Info: character list used to separate from token boundaries | Currently set to : ['.', ',', ';', ':', '!', '?', '*', '-', '(', ')', '"', "'"]
pipe['default_tokenizer'].setCaseSensitiveExceptions(True) | Info: Whether to care for case sensitiveness in exceptions | Currently set to : True
pipe['default_tokenizer'].setMinLength(0) | Info: Set the minimum allowed legth for each token | Currently set to : 0
pipe['default_tokenizer'].setMaxLength(99999) | Info: Set the maximum allowed legth for each token | Currently set to : 99999
>>> pipe['document_assembler'] has settable params:
pipe['document_assembler'].setCleanupMode('shrink') | Info: possible values: disabled, inplace, inplace_full, shrink, shrink_full, each, each_full, delete_full | Currently set to : shrink
Judges if a sentence is grammatically acceptable.
This is a sub-task of GLUE.
sentence | prediction |
---|---|
Anna and Mike is going skiing and they is liked is | unacceptable |
Anna and Mike like to dance | acceptable |
Set the `.setTask('cola sentence:')` prefix.
cola
sentence: Anna and Mike is going skiing and they is liked is
# Set the task on T5
t5['t5'].setTask('cola sentence: ')
# define Data
data = ['Anna and Mike is going skiing and they is liked is','Anna and Mike like to dance']
#Predict on text data with T5
t5.predict(data)
T5 | document | |
---|---|---|
origin_index | ||
0 | unacceptable | Anna and Mike is going skiing and they is like... |
1 | acceptable | Anna and Mike like to dance |
The RTE task is defined as recognizing, given two text fragments, whether the meaning of one text can be inferred (entailed) from the other or not.
Classification of sentence pairs as entailment and not_entailment.
This is a sub-task of GLUE and SuperGLUE.
sentence 1 | sentence 2 | prediction |
---|---|---|
Kessler ’s team conducted 60,643 interviews with adults in 14 countries. | Kessler ’s team interviewed more than 60,000 adults in 14 countries | entailment |
Peter loves New York, it is his favorite city | Peter loves new York. | entailment |
Recent report say Johnny makes he alot of money, he earned 10 million USD each year for the last 5 years. | Johnny is a millionare | entailment |
Recent report say Johnny makes he alot of money, he earned 10 million USD each year for the last 5 years. | Johnny is a poor man | not_entailment |
It was raining in England for the last 4 weeks | England was very dry yesterday | not_entailment |
Set the task with `.setTask('rte sentence1:')` and prefix the second sentence with `sentence2:`.
rte
sentence1: Recent report say Peter makes he alot of money, he earned 10 million USD each year for the last 5 years.
sentence2: Peter is a millionare.
# Set the task on T5
t5['t5'].setTask('rte sentence1: ')
data = [
'Recent report say Peter makes he alot of money, he earned 10 million USD each year for the last 5 years. sentence2: Peter is a millionare',
'Recent report say Peter makes he alot of money, he earned 10 million USD each year for the last 5 years. sentence2: Peter is a poor man']
#Predict on text data with T5
t5.predict(data)
T5 | document | |
---|---|---|
origin_index | ||
0 | entailment | Recent report say Peter makes he alot of money... |
1 | not_entailment | Recent report say Peter makes he alot of money... |
Classification of sentence pairs with the labels entailment, contradiction, and neutral.
This is a sub-task of GLUE.
This classifier predicts, for a hypothesis and a premise, which of these relations holds:
Hypothesis | Premise | prediction |
---|---|---|
Recent report say Johnny makes he alot of money, he earned 10 million USD each year for the last 5 years. | Johnny is a poor man. | contradiction |
It rained in England the last 4 weeks. | It was snowing in New York last week | neutral |
Set the task with `.setTask('mnli hypothesis:')` and prefix the second sentence with `premise:`.
mnli
hypothesis: At 8:34, the Boston Center controller received a third, transmission from American 11.
premise: The Boston Center controller got a third transmission from American 11.
# Set the task on T5
t5['t5'].setTask('mnli ')
# define Data, add additional tags between sentences
data = [
''' hypothesis: At 8:34, the Boston Center controller received a third, transmission from American 11.
premise: The Boston Center controller got a third transmission from American 11.
'''
,
'''
hypothesis: Recent report say Johnny makes he alot of money, he earned 10 million USD each year for the last 5 years.
premise: Johnny is a poor man.
'''
]
# Set the task on T5
#Predict on text data with T5
t5.predict(data)
T5 | document | |
---|---|---|
origin_index | ||
0 | neutral | hypothesis: At 8:34, the Boston Center control... |
1 | contradiction | hypothesis: Recent report say Johnny makes he ... |
Detect whether one sentence is a re-phrasing of, or similar to, another sentence.
This is a sub-task of GLUE.
Sentence1 | Sentence2 | prediction |
---|---|---|
We acted because we saw the existing evidence in a new light , through the prism of our experience on 11 September , " Rumsfeld said . | Rather , the US acted because the administration saw " existing evidence in a new light , through the prism of our experience on September 11 " . | equivalent |
I like to eat peanutbutter for breakfast | I like to play football | not_equivalent |
Set the task with `.setTask('mrpc sentence1:')` and prefix the second sentence with `sentence2:`.
mrpc
sentence1: We acted because we saw the existing evidence in a new light , through the prism of our experience on 11 September , " Rumsfeld said .
sentence2: Rather , the US acted because the administration saw " existing evidence in a new light , through the prism of our experience on September 11",
# Set the task on T5
t5['t5'].setTask('mrpc ')
# define Data, add additional tags between sentences
data = [
''' sentence1: We acted because we saw the existing evidence in a new light , through the prism of our experience on 11 September , " Rumsfeld said .
sentence2: Rather , the US acted because the administration saw " existing evidence in a new light , through the prism of our experience on September 11 "
'''
,
'''
sentence1: I like to eat peanutbutter for breakfast
sentence2: I like to play football.
'''
]
#Predict on text data with T5
t5.predict(data)
T5 | document | |
---|---|---|
origin_index | ||
0 | equivalent | sentence1: We acted because we saw the existin... |
1 | not_equivalent | sentence1: I like to eat peanutbutter for brea... |
Classify whether a question is answered by a sentence (entailment).
This is a sub-task of GLUE.
Question | Answer | prediction |
---|---|---|
Where did Jebe die? | Ghenkis Khan recalled Subtai back to Mongolia soon afterward, and Jebe died on the road back to Samarkand | entailment |
What does Steve like to eat? | Steve watches TV all day | not_entailment |
Set the task with `.setTask('qnli')`, then prefix the question with `question:` and the sentence with `sentence:`.
qnli
question: Where did Jebe die?
sentence: Ghenkis Khan recalled Subtai back to Mongolia soon afterwards, and Jebe died on the road back to Samarkand,
# Set the task on T5
t5['t5'].setTask('QNLI ')
# define Data, add additional tags between sentences
data = [
''' question: Where did Jebe die?
sentence: Ghenkis Khan recalled Subtai back to Mongolia soon afterwards, and Jebe died on the road back to Samarkand,
'''
,
'''
question: What does Steve like to eat?
sentence: Steve watches TV all day
'''
]
#Predict on text data with T5
t5.predict(data)
T5 | document | |
---|---|---|
origin_index | ||
0 | entailment | question: Where did Jebe die? sentence: Ghenki... |
1 | not_entailment | question: What does Steve like to eat? sentenc... |
Based on a quora dataset, determine whether a pair of questions are semantically equivalent.
This is a sub-task of GLUE.
Question1 | Question2 | prediction |
---|---|---|
What attributes would have made you highly desirable in ancient Rome? | How I GET OPPERTINUTY TO JOIN IT COMPANY AS A FRESHER? | not_duplicate |
What was it like in Ancient rome? | What was Ancient rome like? | duplicate |
Set the task with `.setTask('qqp question1:')` and prefix the second question with `question2:`.
qqp
question1: What attributes would have made you highly desirable in ancient Rome?
question2: How I GET OPPERTINUTY TO JOIN IT COMPANY AS A FRESHER?',
# Set the task on T5
t5['t5'].setTask('qqp ')
# define Data, add additional tags between sentences
data = [
''' question1: What attributes would have made you highly desirable in ancient Rome?
question2: How I GET OPPERTINUTY TO JOIN IT COMPANY AS A FRESHER?'
'''
,
'''
question1: What was it like in Ancient rome?
question2: What was Ancient rome like?
'''
]
#Predict on text data with T5
t5.predict(data)
T5 | document | |
---|---|---|
origin_index | ||
0 | not_duplicate | question1: What attributes would have made you... |
1 | duplicate | question1: What was it like in Ancient rome? q... |
Binary sentiment classification.
This is a sub-task of GLUE.
Sentence1 | Prediction |
---|---|
it confirms fincher ’s status as a film maker who artfully bends technical know-how to the service of psychological insight | positive |
I really hated that movie | negative |
.setTask('sst2 sentence: ')
sst2
sentence: I hated that movie
# Set the task on T5
t5['t5'].setTask('sst2 sentence: ')
# define Data, add additional tags between sentences
data = [
''' I really hated that movie''',
''' it confirms fincher ’s status as a film maker who artfully bends technical know-how to the service of psychological insight'''
]
#Predict on text data with T5
t5.predict(data)
T5 | document | |
---|---|---|
origin_index | ||
0 | negative | I really hated that movie |
1 | positive | it confirms fincher ’s status as a film maker ... |
Measures how similar two sentences are on a scale from 0 to 5, with 21 classes representing a regression-style label.
This is a sub-task of GLUE.
Question1 | Question2 | prediction |
---|---|---|
What attributes would have made you highly desirable in ancient Rome? | How I GET OPPERTINUTY TO JOIN IT COMPANY AS A FRESHER? | 0 |
What was it like in Ancient rome? | What was Ancient rome like? | 5.0 |
What was live like as a King in Ancient Rome?? | What is it like to live in Rome? | 3.2 |
Set the task with `.setTask('stsb sentence1:')` and prefix the second sentence with `sentence2:`.
stsb
sentence1: What attributes would have made you highly desirable in ancient Rome?
sentence2: How I GET OPPERTINUTY TO JOIN IT COMPANY AS A FRESHER?
# Set the task on T5
t5['t5'].setTask('stsb ')
# define Data, add additional tags between sentences
data = [
''' sentence1: What attributes would have made you highly desirable in ancient Rome?
sentence2: How I GET OPPERTINUTY TO JOIN IT COMPANY AS A FRESHER?
'''
,
'''
sentence1: What was it like in Ancient rome?
sentence2: What was Ancient rome like?
''',
'''
sentence1: What was live like as a King in Ancient Rome??
sentence2: What is it like to live in Rome?
'''
]
#Predict on text data with T5
t5.predict(data)
origin_index | T5 | document |
---|---|---|
0 | 0.0 | sentence1: What attributes would have made you...
1 | 5.0 | sentence1: What was it like in Ancient rome? s...
2 | 3.2 | sentence1: What was live like as a King in Anc...
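The sentence1:/sentence2: tagging and the 21-point label scale can be sketched in plain Python. The helper name make_stsb_input is hypothetical, not part of the library:

```python
def make_stsb_input(sentence1: str, sentence2: str) -> str:
    """Tag a sentence pair for the T5 STSB task."""
    return f"sentence1: {sentence1} sentence2: {sentence2}"

# The 21 possible similarity labels: 0.0, 0.25, 0.5, ..., 5.0
stsb_labels = [round(0.25 * i, 2) for i in range(21)]
```

T5 treats STSB as text generation, so the regression target is rounded to the nearest of these 21 values.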
Classify whether a premise contradicts a hypothesis.
Predicts entailment, neutral, or contradiction.
This is a sub-task of SuperGLUE.
Hypothesis | Premise | Prediction |
---|---|---|
Valence was helping | Valence the void-brain, Valence the virtuous valet. Why couldn’t the figger choose his own portion of titanic anatomy to shaft? Did he think he was helping? | Contradiction
.setTask('cb hypothesis: ')
and prefix premise with premise:
cb
hypothesis: Valence was helping
premise: Valence the void-brain, Valence the virtuous valet. Why couldn’t the figger choose his own portion of titanic anatomy to shaft? Did he think he was helping?
# Set the task on T5
t5['t5'].setTask('cb ')
# define Data, add additional tags between sentences
data = [
'''
hypothesis: Recent report say Johnny makes he alot of money, he earned 10 million USD each year for the last 5 years.
premise: Johnny is a poor man.
''']
#Predict on text data with T5
t5.predict(data)
origin_index | T5 | document |
---|---|---|
0 | contradiction | hypothesis: Recent report say Johnny makes he ...
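The hypothesis:/premise: tagging can likewise be generated with a tiny helper. This is a sketch; make_cb_input is a hypothetical name:

```python
def make_cb_input(hypothesis: str, premise: str) -> str:
    """Tag a hypothesis/premise pair for the T5 CB task."""
    return f"hypothesis: {hypothesis} premise: {premise}"

data = [
    make_cb_input(
        "Johnny earned 10 million USD each year for the last 5 years.",
        "Johnny is a poor man.",
    )
]
```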
The Choice of Plausible Alternatives (COPA) task by Roemmele et al. (2011) evaluates causal reasoning between events, which requires commonsense knowledge about what usually takes place in the world. Each example provides a premise and either asks for the correct cause or effect from two choices, thus testing either backward or forward causal reasoning. The COPA data, which consists of 1,000 examples in total, can be downloaded at https://people.ict.usc.e
This is a sub-task of SuperGLUE.
This classifier selects which of two given choices is the correct one, based on a premise.
Premise: The man lost his balance on the ladder.
question: What happened as a result?
Alternative 1: He fell off the ladder.
Alternative 2: He climbed up the ladder.
Premise: The man fell unconscious. What was the cause of this?
Alternative 1: The assailant struck the man in the head.
Alternative 2: The assailant took the man’s wallet.
Question | Premise | Choice 1 | Choice 2 | Prediction |
---|---|---|---|---|
effect | Political Violence broke out in the nation. | Many citizens relocated to the capitol. | Many citizens took refuge in other territories. | Choice 1
cause | The men fell unconscious. | The assailant struck the man in the head. | The assailant took the man’s wallet. | Choice 1
.setTask('copa choice1: ')
, prefix choice2 with choice2:
, prefix premise with premise:
and prefix the question with question:
copa
choice1: He fell off the ladder
choice2: He climbed up the ladder
premise: The man lost his balance on the ladder
question: effect
# Set the task on T5
t5['t5'].setTask('copa ')
# define Data, add additional tags between sentences
data = [
'''
choice1: He fell off the ladder
choice2: He climbed up the ladder
premise: The man lost his balance on the ladder
question: effect
''']
#Predict on text data with T5
t5.predict(data)
origin_index | T5 | document |
---|---|---|
0 | choice1 | choice1: He fell off the ladder choice2: He cl...
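The four COPA tags can be assembled with a helper that also validates the question type. This is a sketch; make_copa_input is a hypothetical name:

```python
def make_copa_input(choice1: str, choice2: str, premise: str, question: str) -> str:
    """Tag a COPA example; question must be 'cause' or 'effect'."""
    if question not in ("cause", "effect"):
        raise ValueError("question must be 'cause' or 'effect'")
    return (
        f"choice1: {choice1} choice2: {choice2} "
        f"premise: {premise} question: {question}"
    )

data = [
    make_copa_input(
        "He fell off the ladder",
        "He climbed up the ladder",
        "The man lost his balance on the ladder",
        "effect",
    )
]
```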
Evaluates an answer for a question as true or false, based on an input paragraph.
The T5 model predicts, for a question and a paragraph of sentences, whether an answer is true or not, based on the semantic contents of the paragraph.
This is a sub-task of SuperGLUE.
Exceeds human performance by a large margin.
Question | Answer | Prediction | paragraph |
---|---|---|---|
Why was Joey surprised the morning he woke up for breakfast? | There was only pie to eat, rather than traditional breakfast foods | True | Once upon a time, there was a squirrel named Joey. Joey loved to go outside and play with his cousin Jimmy. Joey and Jimmy played silly games together, and were always laughing. One day, Joey and Jimmy went swimming together 50 at their Aunt Julie’s pond. Joey woke up early in the morning to eat some food before they left. He couldn’t find anything to eat except for pie! Usually, Joey would eat cereal, fruit (a pear), or oatmeal for breakfast. After he ate, he and Jimmy went to the pond. On their way there they saw their friend Jack Rabbit. They dove into the water and swam for several hours. The sun was out, but the breeze was cold. Joey and Jimmy got out of the water and started walking home. Their fur was wet, and the breeze chilled them. When they got home, they dried off, and Jimmy put on his favorite purple shirt. Joey put on a blue shirt with red and green dots. The two squirrels ate some food that Joey’s mom, Jasmine, made and went off to bed., |
Why was Joey surprised the morning he woke up for breakfast? | There was a T-Rex in his garden | False | Once upon a time, there was a squirrel named Joey. Joey loved to go outside and play with his cousin Jimmy. Joey and Jimmy played silly games together, and were always laughing. One day, Joey and Jimmy went swimming together 50 at their Aunt Julie’s pond. Joey woke up early in the morning to eat some food before they left. He couldn’t find anything to eat except for pie! Usually, Joey would eat cereal, fruit (a pear), or oatmeal for breakfast. After he ate, he and Jimmy went to the pond. On their way there they saw their friend Jack Rabbit. They dove into the water and swam for several hours. The sun was out, but the breeze was cold. Joey and Jimmy got out of the water and started walking home. Their fur was wet, and the breeze chilled them. When they got home, they dried off, and Jimmy put on his favorite purple shirt. Joey put on a blue shirt with red and green dots. The two squirrels ate some food that Joey’s mom, Jasmine, made and went off to bed., |
.setTask('multirc questions: ')
followed by answer:
prefix for the answer to evaluate, followed by paragraph:
and then a series of sentences, where each sentence is prefixed with Sent n:
multirc questions: Why was Joey surprised the morning he woke up for breakfast?
answer: There was a T-REX in his garden.
paragraph:
Sent 1: Once upon a time, there was a squirrel named Joey.
Sent 2: Joey loved to go outside and play with his cousin Jimmy.
Sent 3: Joey and Jimmy played silly games together, and were always laughing.
Sent 4: One day, Joey and Jimmy went swimming together 50 at their Aunt Julie’s pond.
Sent 5: Joey woke up early in the morning to eat some food before they left.
Sent 6: He couldn’t find anything to eat except for pie!
Sent 7: Usually, Joey would eat cereal, fruit (a pear), or oatmeal for breakfast.
Sent 8: After he ate, he and Jimmy went to the pond.
Sent 9: On their way there they saw their friend Jack Rabbit.
Sent 10: They dove into the water and swam for several hours.
Sent 11: The sun was out, but the breeze was cold.
Sent 12: Joey and Jimmy got out of the water and started walking home.
Sent 13: Their fur was wet, and the breeze chilled them.
Sent 14: When they got home, they dried off, and Jimmy put on his favorite purple shirt.
Sent 15: Joey put on a blue shirt with red and green dots.
Sent 16: The two squirrels ate some food that Joey’s mom, Jasmine, made and went off to bed.
# Set the task on T5
t5['t5'].setTask('multirc ')
# define Data, add additional tags between sentences
data = [
'''
questions: Why was Joey surprised the morning he woke up for breakfast?
answer: There was a T-REX in his garden.
paragraph:
Sent 1: Once upon a time, there was a squirrel named Joey.
Sent 2: Joey loved to go outside and play with his cousin Jimmy.
Sent 3: Joey and Jimmy played silly games together, and were always laughing.
Sent 4: One day, Joey and Jimmy went swimming together 50 at their Aunt Julie’s pond.
Sent 5: Joey woke up early in the morning to eat some food before they left.
Sent 6: He couldn’t find anything to eat except for pie!
Sent 7: Usually, Joey would eat cereal, fruit (a pear), or oatmeal for breakfast.
Sent 8: After he ate, he and Jimmy went to the pond.
Sent 9: On their way there they saw their friend Jack Rabbit.
Sent 10: They dove into the water and swam for several hours.
Sent 11: The sun was out, but the breeze was cold.
Sent 12: Joey and Jimmy got out of the water and started walking home.
Sent 13: Their fur was wet, and the breeze chilled them.
Sent 14: When they got home, they dried off, and Jimmy put on his favorite purple shirt.
Sent 15: Joey put on a blue shirt with red and green dots.
Sent 16: The two squirrels ate some food that Joey’s mom, Jasmine, made and went off to bed.
''',
'''
questions: Why was Joey surprised the morning he woke up for breakfast?
answer: There was only pie for breakfast.
paragraph:
Sent 1: Once upon a time, there was a squirrel named Joey.
Sent 2: Joey loved to go outside and play with his cousin Jimmy.
Sent 3: Joey and Jimmy played silly games together, and were always laughing.
Sent 4: One day, Joey and Jimmy went swimming together 50 at their Aunt Julie’s pond.
Sent 5: Joey woke up early in the morning to eat some food before they left.
Sent 6: He couldn’t find anything to eat except for pie!
Sent 7: Usually, Joey would eat cereal, fruit (a pear), or oatmeal for breakfast.
Sent 8: After he ate, he and Jimmy went to the pond.
Sent 9: On their way there they saw their friend Jack Rabbit.
Sent 10: They dove into the water and swam for several hours.
Sent 11: The sun was out, but the breeze was cold.
Sent 12: Joey and Jimmy got out of the water and started walking home.
Sent 13: Their fur was wet, and the breeze chilled them.
Sent 14: When they got home, they dried off, and Jimmy put on his favorite purple shirt.
Sent 15: Joey put on a blue shirt with red and green dots.
Sent 16: The two squirrels ate some food that Joey’s mom, Jasmine, made and went off to bed.
'''
]
#Predict on text data with T5
t5.predict(data)
origin_index | T5 | document |
---|---|---|
0 | False | questions: Why was Joey surprised the morning ...
1 | True | questions: Why was Joey surprised the morning ...
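Writing the Sent n: prefixes by hand is tedious; a helper can number the sentences automatically. This is a sketch; make_multirc_input is a hypothetical name:

```python
def make_multirc_input(question: str, answer: str, sentences: list) -> str:
    """Number each sentence as 'Sent n:' and join with the question/answer tags."""
    paragraph = " ".join(
        f"Sent {i}: {s}" for i, s in enumerate(sentences, start=1)
    )
    return f"questions: {question} answer: {answer} paragraph: {paragraph}"

story = [
    "Once upon a time, there was a squirrel named Joey.",
    "Joey loved to go outside and play with his cousin Jimmy.",
]
data = [
    make_multirc_input(
        "Why was Joey surprised the morning he woke up for breakfast?",
        "There was a T-REX in his garden.",
        story,
    )
]
```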
Decide, for two sentences with a shared ambiguous word, whether the target word has the same semantic meaning in both sentences.
This is a sub-task of SuperGLUE.
Prediction | Ambiguous word | Sentence 1 | Sentence 2 |
---|---|---|---|
False | kill | He totally killed that rock show! | The airplane crash killed his family |
True | window | The expanded window will give us time to catch the thieves. | You have a two-hour window for turning in your homework. |
False | window | He jumped out of the window. | You have a two-hour window for turning in your homework. |
.setTask('wic pos: ')
followed by sentence1:
prefix for the first sentence, followed by sentence2:
prefix for the second sentence.
wic pos:
sentence1: The expanded window will give us time to catch the thieves.
sentence2: You have a two-hour window of turning in your homework.
word : window
# Set the task on T5
t5['t5'].setTask('wic ')
# define Data, add additional tags between sentences
data = [
'''
pos:
sentence1: The expanded window will give us time to catch the thieves.
sentence2: You have a two-hour window of turning in your homework.
word : window
''',]
#Predict on text data with T5
t5.predict(data)
origin_index | T5 | document |
---|---|---|
0 | True | pos: sentence1: The expanded window will give ...
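The WiC tagging can be generated the same way. This is a sketch; make_wic_input is a hypothetical name, and the unusual "word :" spacing mirrors the example above:

```python
def make_wic_input(sentence1: str, sentence2: str, word: str) -> str:
    """Tag a sentence pair and target word for the T5 WiC task."""
    return (
        f"pos: sentence1: {sentence1} "
        f"sentence2: {sentence2} word : {word}"
    )

data = [
    make_wic_input(
        "The expanded window will give us time to catch the thieves.",
        "You have a two-hour window for turning in your homework.",
        "window",
    )
]
```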
Predict, for an ambiguous pronoun in a sentence, the noun it is referring to.
This is a sub-task of GLUE and SuperGLUE.
Prediction | Text |
---|---|
stable | The stable was very roomy, with four good stalls; a large swinging window opened into the yard , which made it pleasant and airy. |
.setTask('wsc: ')
and surround the pronoun with asterisk symbols.
The ambiguous pronoun should be surrounded with * symbols.
*Note* Read Appendix A. for more info
wsc:
The stable was very roomy, with four good stalls; a large swinging window opened into the yard , which made *it* pleasant and airy.
# Note: this task does not yet work 100% correctly
# Set the task on T5
t5['t5'].setTask('wsc ')
# define Data, add additional tags between sentences
data = ['''The stable was very roomy, with four good stalls; a large swinging window opened into the yard , which made *it* pleasant and airy.''']
#Predict on text data with T5
t5.predict(data)
origin_index | T5 | document |
---|---|---|
0 | wsc The stable was very roomy, with four good ... | The stable was very roomy, with four good stal...
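Surrounding the ambiguous pronoun with asterisks can be automated; the sketch below marks the first whole-word occurrence. mark_pronoun is a hypothetical name, not part of the library:

```python
import re

def mark_pronoun(sentence: str, pronoun: str) -> str:
    """Surround the first whole-word occurrence of the pronoun with asterisks."""
    pattern = rf"\b{re.escape(pronoun)}\b"
    return re.sub(pattern, f"*{pronoun}*", sentence, count=1)

data = [
    mark_pronoun(
        "The stable was very roomy, with four good stalls; a large swinging "
        "window opened into the yard, which made it pleasant and airy.",
        "it",
    )
]
```

The word-boundary pattern avoids matching the pronoun inside longer words (e.g. "it" inside "with").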
Summarizes a paragraph into a shorter version with the same semantic meaning.
Predicted summary | Text |
---|---|
manchester united face newcastle in the premier league on wednesday . louis van gaal's side currently sit two points clear of liverpool in fourth . the belgian duo took to the dance floor on monday night with some friends . | the belgian duo took to the dance floor on monday night with some friends . manchester united face newcastle in the premier league on wednesday . red devils will be looking for just their second league away win in seven . louis van gaal’s side currently sit two points clear of liverpool in fourth . |
.setTask('summarize: ')
This task requires no pre-processing, setting the task to summarize
is sufficient.
the belgian duo took to the dance floor on monday night with some friends . manchester united face newcastle in the premier league on wednesday . red devils will be looking for just their second league away win in seven . louis van gaal’s side currently sit two points clear of liverpool in fourth .
# Set the task on T5
t5['t5'].setTask('summarize ')
# define Data, add additional tags between sentences
data = [
'''
The belgian duo took to the dance floor on monday night with some friends . manchester united face newcastle in the premier league on wednesday . red devils will be looking for just their second league away win in seven . louis van gaal’s side currently sit two points clear of liverpool in fourth .
''',
''' Calculus, originally called infinitesimal calculus or "the calculus of infinitesimals", is the mathematical study of continuous change, in the same way that geometry is the study of shape and algebra is the study of generalizations of arithmetic operations. It has two major branches, differential calculus and integral calculus; the former concerns instantaneous rates of change, and the slopes of curves, while integral calculus concerns accumulation of quantities, and areas under or between curves. These two branches are related to each other by the fundamental theorem of calculus, and they make use of the fundamental notions of convergence of infinite sequences and infinite series to a well-defined limit.[1] Infinitesimal calculus was developed independently in the late 17th century by Isaac Newton and Gottfried Wilhelm Leibniz.[2][3] Today, calculus has widespread uses in science, engineering, and economics.[4] In mathematics education, calculus denotes courses of elementary mathematical analysis, which are mainly devoted to the study of functions and limits. The word calculus (plural calculi) is a Latin word, meaning originally "small pebble" (this meaning is kept in medicine – see Calculus (medicine)). Because such pebbles were used for calculation, the meaning of the word has evolved and today usually means a method of computation. It is therefore used for naming specific methods of calculation and related theories, such as propositional calculus, Ricci calculus, calculus of variations, lambda calculus, and process calculus.'''
]
#Predict on text data with T5
t5.predict(data)
origin_index | T5 | document |
---|---|---|
0 | manchester united face newcastle in the premie... | The belgian duo took to the dance floor on mon...
1 | calculus, originally called infinitesimal calc... | Calculus, originally called infinitesimal calc...
Predict an answer to a question based on input context.
Predicted Answer | Question | Context |
---|---|---|
carbon monoxide | What does increased oxygen concentrations in the patient’s lungs displace? | Hyperbaric (high-pressure) medicine uses special oxygen chambers to increase the partial pressure of O 2 around the patient and, when needed, the medical staff. Carbon monoxide poisoning, gas gangrene, and decompression sickness (the ’bends’) are sometimes treated using these devices. Increased O 2 concentration in the lungs helps to displace carbon monoxide from the heme group of hemoglobin. Oxygen gas is poisonous to the anaerobic bacteria that cause gas gangrene, so increasing its partial pressure helps kill them. Decompression sickness occurs in divers who decompress too quickly after a dive, resulting in bubbles of inert gas, mostly nitrogen and helium, forming in their blood. Increasing the pressure of O 2 as soon as possible is part of the treatment. |
pie | What did Joey eat for breakfast? | Once upon a time, there was a squirrel named Joey. Joey loved to go outside and play with his cousin Jimmy. Joey and Jimmy played silly games together, and were always laughing. One day, Joey and Jimmy went swimming together 50 at their Aunt Julie’s pond. Joey woke up early in the morning to eat some food before they left. Usually, Joey would eat cereal, fruit (a pear), or oatmeal for breakfast. After he ate, he and Jimmy went to the pond. On their way there they saw their friend Jack Rabbit. They dove into the water and swam for several hours. The sun was out, but the breeze was cold. Joey and Jimmy got out of the water and started walking home. Their fur was wet, and the breeze chilled them. When they got home, they dried off, and Jimmy put on his favorite purple shirt. Joey put on a blue shirt with red and green dots. The two squirrels ate some food that Joey’s mom, Jasmine, made and went off to bed,' |
.setTask('question: ')
and prefix the context which can be made up of multiple sentences with context:
question: What does increased oxygen concentrations in the patient’s lungs displace?
context: Hyperbaric (high-pressure) medicine uses special oxygen chambers to increase the partial pressure of O 2 around the patient and, when needed, the medical staff. Carbon monoxide poisoning, gas gangrene, and decompression sickness (the ’bends’) are sometimes treated using these devices. Increased O 2 concentration in the lungs helps to displace carbon monoxide from the heme group of hemoglobin. Oxygen gas is poisonous to the anaerobic bacteria that cause gas gangrene, so increasing its partial pressure helps kill them. Decompression sickness occurs in divers who decompress too quickly after a dive, resulting in bubbles of inert gas, mostly nitrogen and helium, forming in their blood. Increasing the pressure of O 2 as soon as possible is part of the treatment.
# Set the task on T5
t5['t5'].setTask('question ')
# define Data, add additional tags between sentences
data = ['''
What does increased oxygen concentrations in the patient’s lungs displace?
context: Hyperbaric (high-pressure) medicine uses special oxygen chambers to increase the partial pressure of O 2 around the patient and, when needed, the medical staff. Carbon monoxide poisoning, gas gangrene, and decompression sickness (the ’bends’) are sometimes treated using these devices. Increased O 2 concentration in the lungs helps to displace carbon monoxide from the heme group of hemoglobin. Oxygen gas is poisonous to the anaerobic bacteria that cause gas gangrene, so increasing its partial pressure helps kill them. Decompression sickness occurs in divers who decompress too quickly after a dive, resulting in bubbles of inert gas, mostly nitrogen and helium, forming in their blood. Increasing the pressure of O 2 as soon as possible is part of the treatment.
''']
#Predict on text data with T5
t5.predict(data)
origin_index | T5 | document |
---|---|---|
0 | carbon monoxide | What does increased oxygen concentrations in t...
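Only the context tag needs to be added manually here, since .setTask('question ') already supplies the leading tag. This is a sketch; make_squad_input is a hypothetical name:

```python
def make_squad_input(question: str, context: str) -> str:
    """Append the context tag; .setTask('question ') supplies the leading tag."""
    return f"{question} context: {context}"

data = [
    make_squad_input(
        "What does increased oxygen concentrations in the patient’s lungs displace?",
        "Increased O 2 concentration in the lungs helps to displace "
        "carbon monoxide from the heme group of hemoglobin.",
    )
]
```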
For translation tasks, the Marian model can also be used.
.setTask('translate English to German: ')
# Set the task on T5
t5['t5'].setTask('translate English to German: ')
# define Data, add additional tags between sentences
data = ['''I like sausage and Tea for breakfast with potatoes''']
#Predict on text data with T5
t5.predict(data)
origin_index | T5 | document |
---|---|---|
0 | Ich mag Wurst und Tee zum Frühstück mit Karto... | I like sausage and Tea for breakfast with pota...
For translation tasks, the Marian model can also be used.
.setTask('translate English to French: ')
# Set the task on T5
t5['t5'].setTask('translate English to French: ')
# define Data, add additional tags between sentences
data = ['''I like sausage and Tea for breakfast with potatoes''']
#Predict on text data with T5
t5.predict(data)
origin_index | T5 | document |
---|---|---|
0 | J'aime les saucisses et le thé au petit déjeun... | I like sausage and Tea for breakfast with pota...
For translation tasks, the Marian model can also be used.
.setTask('translate English to Romanian: ')
# Set the task on T5
t5['t5'].setTask('translate English to Romanian: ')
# define Data, add additional tags between sentences
data = [ '''I like sausage and Tea for breakfast with potatoes''']
#Predict on text data with T5
t5.predict(data)
origin_index | T5 | document |
---|---|---|
0 | Mi-ar plăcea cârnaţi şi ceai la micul dejun cu... | I like sausage and Tea for breakfast with pota...