Data

The text.data.question_answering module contains the bits required to use the fastai DataBlock API and/or mid-level data processing pipelines to organize your data for question answering tasks. Question answering tasks take two text inputs (a context that includes the answer, and a question), and the objective is to predict the start/end tokens of the answer within the context.
What we're running with at the time this documentation was generated:
torch: 1.9.0+cu102
fastai: 2.7.9
transformers: 4.21.2

Setup

We’ll use a subset of squad_v2 to demonstrate how to configure your BLURR code for extractive question answering.
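
The code cells on this page assume the usual BLURR notebook setup. A minimal sketch of the assumed imports is shown below; the exact import paths are an assumption and may differ slightly between BLURR versions.

# Assumed setup (a sketch; import paths may vary across BLURR versions)
import pandas as pd
from datasets import load_dataset
from fastcore.test import test_eq
from fastai.text.all import *       # DataBlock, CategoryBlock, ColReader, ColSplitter, ItemGetter, ...
from transformers import AutoModelForQuestionAnswering, logging as hf_logging
from blurr.text.data.all import *   # QAPreprocessor, QATextInput, QABatchTokenizeTransform, get_hf_objects, ...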

raw_datasets = load_dataset("squad_v2", split=["train[:1000]", "validation[:200]"])
Reusing dataset squad_v2 (/home/wgilliam/.cache/huggingface/datasets/squad_v2/squad_v2/2.0.0/09187c73c1b837c95d9a249cd97c2c3f1cebada06efe667b4427714b27639b1d)
raw_train_ds, raw_valid_ds = raw_datasets[0], raw_datasets[1]
raw_train_df = pd.DataFrame(raw_train_ds)
raw_valid_df = pd.DataFrame(raw_valid_ds)

raw_train_df["is_valid"] = False
raw_valid_df["is_valid"] = True

print(len(raw_train_df))
print(len(raw_valid_df))
1000
200
raw_train_df.head(2)
id title context question answers is_valid
0 56be85543aeaaa14008c9063 Beyoncé Beyoncé Giselle Knowles-Carter (/biːˈjɒnseɪ/ bee-YON-say) (born September 4, 1981) is an America... When did Beyonce start becoming popular? {'text': ['in the late 1990s'], 'answer_start': [269]} False
1 56be85543aeaaa14008c9065 Beyoncé Beyoncé Giselle Knowles-Carter (/biːˈjɒnseɪ/ bee-YON-say) (born September 4, 1981) is an America... What areas did Beyonce compete in when she was growing up? {'text': ['singing and dancing'], 'answer_start': [207]} False
raw_valid_df.head(2)
id title context question answers is_valid
0 56ddde6b9a695914005b9628 Normans The Normans (Norman: Nourmands; French: Normands; Latin: Normanni) were the people who in the 10... In what country is Normandy located? {'text': ['France', 'France', 'France', 'France'], 'answer_start': [159, 159, 159, 159]} True
1 56ddde6b9a695914005b9629 Normans The Normans (Norman: Nourmands; French: Normands; Latin: Normanni) were the people who in the 10... When were the Normans in Normandy? {'text': ['10th and 11th centuries', 'in the 10th and 11th centuries', '10th and 11th centuries'... True
squad_df = pd.concat([raw_train_df, raw_valid_df])
len(squad_df)
1200
squad_df["ans_start_char_idx"] = squad_df.answers.apply(lambda v: v["answer_start"][0] if len(v["answer_start"]) > 0 else "0")
squad_df["answer_text"] = squad_df.answers.apply(lambda v: v["text"][0] if len(v["text"]) > 0 else "")
squad_df["ans_end_char_idx"] = squad_df["ans_start_char_idx"].astype(int) + squad_df["answer_text"].str.len()

print(len(squad_df))
squad_df[squad_df.is_valid == True].head(2)
1200
id title context question answers is_valid ans_start_char_idx answer_text ans_end_char_idx
0 56ddde6b9a695914005b9628 Normans The Normans (Norman: Nourmands; French: Normands; Latin: Normanni) were the people who in the 10... In what country is Normandy located? {'text': ['France', 'France', 'France', 'France'], 'answer_start': [159, 159, 159, 159]} True 159 France 165
1 56ddde6b9a695914005b9629 Normans The Normans (Norman: Nourmands; French: Normands; Latin: Normanni) were the people who in the 10... When were the Normans in Normandy? {'text': ['10th and 11th centuries', 'in the 10th and 11th centuries', '10th and 11th centuries'... True 94 10th and 11th centuries 117
model_cls = AutoModelForQuestionAnswering
hf_logging.set_verbosity_error()

pretrained_model_name = "roberta-base"  #'xlm-mlm-ende-1024'
hf_arch, hf_config, hf_tokenizer, hf_model = get_hf_objects(pretrained_model_name, model_cls=model_cls)

max_seq_len = 128
vocab = dict(enumerate(range(max_seq_len)))

Preprocessing

With version 2.0.0 of BLURR, we include a preprocessor for question answering that can either truncate texts or chunk long documents into multiple examples.

Note: Unlike other NLP tasks in BLURR, extractive question answering requires preprocessing in order to convert the raw start/end character indices into start/end token indices, unless your dataset already includes the latter. Token indices, rather than character indices, are used as the targets, and they depend on your tokenizer of choice.
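
To see why this conversion is tokenizer-dependent, here is a hedged sketch of mapping an answer’s character span to a token span using a fast tokenizer’s offset mappings. It mirrors the idea behind what QAPreprocessor does for you (an assumption about its internals); the context and answer strings are made up for illustration.

# Sketch: map an answer's character span to a token span via offset mappings
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("roberta-base")

context = "Beyonce started becoming popular in the late 1990s."
answer = "in the late 1990s"
ans_start_char = context.index(answer)
ans_end_char = ans_start_char + len(answer)

enc = tok(context, return_offsets_mapping=True)

# a token belongs to the answer if its character span overlaps [ans_start_char, ans_end_char)
overlapping = [
    i for i, (s, e) in enumerate(enc["offset_mapping"])
    if e > ans_start_char and s < ans_end_char
]
ans_start_token_idx, ans_end_token_idx = overlapping[0], overlapping[-1] + 1

# decoding the token span recovers the original answer text
print(tok.decode(enc["input_ids"][ans_start_token_idx:ans_end_token_idx]).strip())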


source

QAPreprocessor

 QAPreprocessor (hf_tokenizer:transformers.tokenization_utils_base.PreTrainedTokenizerBase,
                 batch_size:int=1000, id_attr:Optional[str]=None,
                 ctx_attr:str='context', qst_attr:str='question',
                 ans_attr:str='answer_text',
                 ans_start_char_idx:str='ans_start_char_idx',
                 ans_end_char_idx:str='ans_end_char_idx',
                 is_valid_attr:Optional[str]='is_valid',
                 tok_kwargs:dict={'return_overflowing_tokens': True})

Initialize self. See help(type(self)) for accurate signature.

| | Type | Default | Details |
|---|---|---|---|
| hf_tokenizer | PreTrainedTokenizerBase | | A Hugging Face tokenizer |
| batch_size | int | 1000 | The number of examples to process at a time |
| id_attr | Optional | None | The unique identifier in the dataset. If not specified and "return_overflowing_tokens" is True, an "_id" attribute will be added to your dataset, with a unique, sequential integer assigned to each record |
| ctx_attr | str | context | The attribute in your dataset that contains the context (in which the answer is included) |
| qst_attr | str | question | The attribute in your dataset that contains the question being asked |
| ans_attr | str | answer_text | The attribute in your dataset that contains the answer text |
| ans_start_char_idx | str | ans_start_char_idx | The attribute in your dataset that contains the start character index of the answer within the context |
| ans_end_char_idx | str | ans_end_char_idx | The attribute in your dataset that contains the end character index of the answer within the context |
| is_valid_attr | Optional | is_valid | The attribute that will be created if you are processing separate training and validation datasets into a single dataset; it indicates which dataset each example belongs to |
| tok_kwargs | dict | {'return_overflowing_tokens': True} | Tokenization kwargs that will be applied when calling the tokenizer |
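
If your DataFrame uses different column names than the defaults above, point the preprocessor at them explicitly. The column names in this sketch are hypothetical and only illustrate the parameters:

# Sketch: QAPreprocessor with non-default column names (column names here are hypothetical)
preprocessor = QAPreprocessor(
    hf_tokenizer,
    ctx_attr="passage",               # column holding the context
    qst_attr="query",                 # column holding the question
    ans_attr="answer",                # column holding the answer text
    ans_start_char_idx="start_char",  # column holding the answer's start character index
    ans_end_char_idx="end_char",      # column holding the answer's end character index
)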

How to preprocess your data

tok_kwargs = {"return_overflowing_tokens": True, "max_length": max_seq_len, "stride": 64}
preprocessor = QAPreprocessor(hf_tokenizer, id_attr="id", tok_kwargs=tok_kwargs)
proc_df = preprocessor.process_df(squad_df)

print(len(proc_df))
proc_df.head(4)
3560
id title context question answers is_valid ans_start_char_idx answer_text ans_end_char_idx proc_question proc_context ans_start_token_idx ans_end_token_idx is_answerable
0 56be85543aeaaa14008c9063 Beyoncé Beyoncé Giselle Knowles-Carter (/biːˈjɒnseɪ/ bee-YON-say) (born September 4, 1981) is an America... When did Beyonce start becoming popular? {'text': ['in the late 1990s'], 'answer_start': [269]} False 269 in the late 1990s 286 When did Beyonce start becoming popular? Beyoncé Giselle Knowles-Carter (/biːˈjɒnseɪ/ bee-YON-say) (born September 4, 1981) is an America... 84 89 True
1 56be85543aeaaa14008c9063 Beyoncé Beyoncé Giselle Knowles-Carter (/biːˈjɒnseɪ/ bee-YON-say) (born September 4, 1981) is an America... When did Beyonce start becoming popular? {'text': ['in the late 1990s'], 'answer_start': [269]} False 269 in the late 1990s 286 When did Beyonce start becoming popular? in Houston, Texas, she performed in various singing and dancing competitions as a child, and ro... 32 37 True
2 56be85543aeaaa14008c9063 Beyoncé Beyoncé Giselle Knowles-Carter (/biːˈjɒnseɪ/ bee-YON-say) (born September 4, 1981) is an America... When did Beyonce start becoming popular? {'text': ['in the late 1990s'], 'answer_start': [269]} False 269 in the late 1990s 286 When did Beyonce start becoming popular? group became one of the world's best-selling girl groups of all time. Their hiatus saw the rele... 0 0 False
3 56be85543aeaaa14008c9065 Beyoncé Beyoncé Giselle Knowles-Carter (/biːˈjɒnseɪ/ bee-YON-say) (born September 4, 1981) is an America... What areas did Beyonce compete in when she was growing up? {'text': ['singing and dancing'], 'answer_start': [207]} False 207 singing and dancing 226 What areas did Beyonce compete in when she was growing up? Beyoncé Giselle Knowles-Carter (/biːˈjɒnseɪ/ bee-YON-say) (born September 4, 1981) is an America... 77 80 True
# Sanity check: decoding the start/end token span should recover the original answer text
sampled_df = proc_df.sample(n=10)
for row_idx, row in sampled_df.iterrows():
    test_example = row

    inputs = hf_tokenizer(row.proc_question, row.proc_context)

    if test_example.is_answerable:
        # print(test_example.answer_text)
        test_eq(
            test_example.answer_text,
            hf_tokenizer.decode(inputs["input_ids"][test_example.ans_start_token_idx : test_example.ans_end_token_idx]).strip(),
        )
    else:
        test_eq(test_example.ans_start_token_idx, 0)
        test_eq(test_example.ans_end_token_idx, 0)

If you want to drop texts longer than your model can handle (and keep only answerable contexts), turn off overflowing tokens and filter the processed DataFrame:

preprocessor = QAPreprocessor(hf_tokenizer, tok_kwargs={"return_overflowing_tokens": False, "max_length": max_seq_len})
proc2_df = preprocessor.process_df(squad_df)
proc2_df = proc2_df[(proc2_df.ans_end_token_idx < max_seq_len) & (proc2_df.is_answerable)]

print(len(proc2_df))
proc2_df.head(2)
763
id title context question answers is_valid ans_start_char_idx answer_text ans_end_char_idx proc_question proc_context ans_start_token_idx ans_end_token_idx is_answerable
0 56be85543aeaaa14008c9063 Beyoncé Beyoncé Giselle Knowles-Carter (/biːˈjɒnseɪ/ bee-YON-say) (born September 4, 1981) is an America... When did Beyonce start becoming popular? {'text': ['in the late 1990s'], 'answer_start': [269]} False 269 in the late 1990s 286 When did Beyonce start becoming popular? Beyoncé Giselle Knowles-Carter (/biːˈjɒnseɪ/ bee-YON-say) (born September 4, 1981) is an America... 84 89 True
1 56be85543aeaaa14008c9065 Beyoncé Beyoncé Giselle Knowles-Carter (/biːˈjɒnseɪ/ bee-YON-say) (born September 4, 1981) is an America... What areas did Beyonce compete in when she was growing up? {'text': ['singing and dancing'], 'answer_start': [207]} False 207 singing and dancing 226 What areas did Beyonce compete in when she was growing up? Beyoncé Giselle Knowles-Carter (/biːˈjɒnseɪ/ bee-YON-say) (born September 4, 1981) is an America... 77 80 True

Mid-level API


source

QATextInput

 QATextInput (x, **kwargs)

The base representation of your inputs; used by the various fastai show methods


source

QABatchTokenizeTransform

 QABatchTokenizeTransform (hf_arch:str,
                           hf_config:transformers.configuration_utils.PretrainedConfig,
                           hf_tokenizer:transformers.tokenization_utils_base.PreTrainedTokenizerBase,
                           hf_model:transformers.modeling_utils.PreTrainedModel,
                           include_labels:bool=True, ignore_token_id=-100,
                           max_length:int=None, padding:Union[bool,str]=True,
                           truncation:Union[bool,str]='only_second',
                           is_split_into_words:bool=False,
                           tok_kwargs:dict={}, **kwargs)

Handles everything you need to assemble a mini-batch of inputs and targets, as well as decode the dictionary produced as a byproduct of the tokenization process in the encodes method.

| | Type | Default | Details |
|---|---|---|---|
| hf_arch | str | | The abbreviation/name of your Hugging Face transformer architecture (e.g., bert, bart, etc.) |
| hf_config | PretrainedConfig | | A specific configuration instance you want to use |
| hf_tokenizer | PreTrainedTokenizerBase | | A Hugging Face tokenizer |
| hf_model | PreTrainedModel | | A Hugging Face model |
| include_labels | bool | True | Controls whether the "labels" are included in your inputs. If they are, the loss will be calculated in the model's forward function and you can simply use PreCalculatedLoss as your Learner's loss function |
| ignore_token_id | int | -100 | The token ID that should be ignored when calculating the loss |
| max_length | int | None | Controls the length of the padding/truncation. It can be an integer or None, in which case it will default to the maximum length the model can accept. If the model has no specific maximum input length, truncation/padding to max_length is deactivated. See [Everything you always wanted to know about padding and truncation](https://huggingface.co/transformers/preprocessing.html#everything-you-always-wanted-to-know-about-padding-and-truncation) |
| padding | Union | True | Controls the padding applied by your hf_tokenizer during tokenization. If None, will default to False or 'do_not_pad'. See [Everything you always wanted to know about padding and truncation](https://huggingface.co/transformers/preprocessing.html#everything-you-always-wanted-to-know-about-padding-and-truncation) |
| truncation | Union | only_second | Controls the truncation applied by your hf_tokenizer during tokenization. If None, will default to False or 'do_not_truncate'. See [Everything you always wanted to know about padding and truncation](https://huggingface.co/transformers/preprocessing.html#everything-you-always-wanted-to-know-about-padding-and-truncation) |
| is_split_into_words | bool | False | The is_split_into_words argument applied to your hf_tokenizer during tokenization. Set this to True if your inputs are pre-tokenized (not numericalized) |
| tok_kwargs | dict | {} | Any other keyword arguments you want included when using your hf_tokenizer to tokenize your inputs |
| kwargs | | | |
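
As a quick, hedged illustration of why truncation='only_second' is a sensible default for question answering: with (question, context) pairs, only the second sequence (the context) is truncated, so the question is never cut off. This is plain Hugging Face tokenizer behavior rather than BLURR-specific code.

# Sketch: truncation="only_second" truncates only the context in a (question, context) pair
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("roberta-base")
question = "When did Beyonce start becoming popular?"
context = "Beyoncé Giselle Knowles-Carter is an American singer. " * 50  # deliberately long

enc = tok(question, context, max_length=128, truncation="only_second")
print(len(enc["input_ids"]))  # 128; all of the question's tokens are still present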

Examples

The following examples demonstrate several approaches to constructing your DataBlock for question answering tasks using the mid-level API.

Using the mid-level API

Batch-Time Tokenization

Step 1: Get your Hugging Face objects
hf_logging.set_verbosity_error()

pretrained_model_name = "distilroberta-base"
hf_arch, hf_config, hf_tokenizer, hf_model = get_hf_objects(pretrained_model_name, model_cls=AutoModelForQuestionAnswering)

max_seq_len = 128
vocab = dict(enumerate(range(max_seq_len)))
Step 2: Preprocess dataset
tok_kwargs = {"return_overflowing_tokens": True, "max_length": max_seq_len, "stride": 24}
preprocessor = QAPreprocessor(hf_tokenizer, id_attr="id", tok_kwargs=tok_kwargs)
proc_df = preprocessor.process_df(squad_df)

proc_df.head(1)
id title context question answers is_valid ans_start_char_idx answer_text ans_end_char_idx proc_question proc_context ans_start_token_idx ans_end_token_idx is_answerable
0 56be85543aeaaa14008c9063 Beyoncé Beyoncé Giselle Knowles-Carter (/biːˈjɒnseɪ/ bee-YON-say) (born September 4, 1981) is an America... When did Beyonce start becoming popular? {'text': ['in the late 1990s'], 'answer_start': [269]} False 269 in the late 1990s 286 When did Beyonce start becoming popular? Beyoncé Giselle Knowles-Carter (/biːˈjɒnseɪ/ bee-YON-say) (born September 4, 1981) is an America... 84 89 True
Step 3: Create your DataBlock
before_batch_tfm = QABatchTokenizeTransform(hf_arch, hf_config, hf_tokenizer, hf_model, max_length=max_seq_len)

blocks = (
    TextBlock(batch_tokenize_tfm=before_batch_tfm, input_return_type=QATextInput),
    CategoryBlock(vocab=vocab),
    CategoryBlock(vocab=vocab),
)

dblock = DataBlock(
    blocks=blocks,
    get_x=lambda x: (x.proc_question, x.proc_context),
    get_y=[ColReader("ans_start_token_idx"), ColReader("ans_end_token_idx")],
    splitter=ColSplitter(),
    n_inp=1,
)
Step 4: Build your DataLoaders
dls = dblock.dataloaders(proc_df, bs=4)
len(dls.train), len(dls.valid)
(590, 94)
b = dls.one_batch()
len(b), len(b[0]), len(b[1]), len(b[2])
(3, 8, 4, 4)
b[0]["input_ids"].shape, b[0]["attention_mask"].shape, b[1].shape, b[2].shape
(torch.Size([4, 128]), torch.Size([4, 128]), torch.Size([4]), torch.Size([4]))
b[0]["start_positions"], b[0]["end_positions"]
(TensorCategory([ 0,  0, 85,  0], device='cuda:1'),
 TensorCategory([ 0,  0, 87,  0], device='cuda:1'))

The show_batch method below gives us a more interpretable view of our question/answer data.

dls.show_batch(dataloaders=dls, max_n=4)
text found start/end answer
0 Beyonce has been awarded how many Grammy nominations? ously in Love, B'Day and I Am... Sasha Fierce have all won Best Contemporary R&B Album. Beyoncé set the record for the most Grammy awards won by a female artist in one night in 2010 when she won six awards, breaking the tie she previously held with Alicia Keys, Norah Jones, Alison Krauss, and Amy Winehouse, with Adele equaling this in 2012. Following her role in Dreamgirls she was nominated for Best Original Song for "Listen" and Best Actress at the Golden Globe Awards, and Outstanding Actress False (0, 0)
1 Who did Beyonce record the lead single with in the movie "The Fighting Temptations"? cé starred opposite Cuba Gooding, Jr., in the musical comedy The Fighting Temptations as Lilly, a single mother whom Gooding's character falls in love with. The film received mixed reviews from critics but grossed $30 million in the U.S. Beyoncé released "Fighting Temptation" as the lead single from the film's soundtrack album, with Missy Elliott, MC Lyte, and Free which was also used to promote the film. Another of Beyoncé's contributions to the soundtrack, " True (97, 100) Missy Elliott
2 What did Bryan Lessard name after Beyoncé?'s "Say My Name" and discussed his relationship with women. In January 2012, research scientist Bryan Lessard named Scaptia beyonceae, a species of horse fly found in Northern Queensland, Australia after Beyoncé due to the fly's unique golden hairs on its abdomen. In July 2014, a Beyoncé exhibit was introduced into the "Legends of Rock" section of the Rock and Roll Hall of Fame. The black leotard from the "Single Ladies" video and her outfit from the Super Bowl half time performance are among several pieces housed at True (45, 50) a species of horse fly
3 How many awards did Beyonce take home with her at the 57th Grammy Awards? ogue magazine was unveiled online, Beyoncé as the cover star, becoming the first African-American artist and third African-American woman in general to cover the September issue. She headlined the 2015 Made in America festival in early September and also the Global Citizen Festival later that month. Beyoncé made an uncredited featured appearance on the track "Hymn for the Weekend" by British rock band Coldplay, on their seventh studio album A Head Full of Dreams (2015), which saw release in December. On January 7, 2016, False (0, 0)

Passing extra information

As mentioned in the data.core module documentation, BLURR also allows you to pass extra information alongside your inputs in the form of a dictionary. If we are splitting long documents into chunks but want to predict/aggregate by example (rather than by chunk), we’ll need to include a unique identifier for each example. When we look at the modeling.question_answering module, we’ll see how the question answering bits can use such an ID for this purpose.

Step 1: Get your Hugging Face objects
hf_logging.set_verbosity_error()

pretrained_model_name = "bert-large-uncased-whole-word-masking-finetuned-squad"
hf_arch, hf_config, hf_tokenizer, hf_model = get_hf_objects(pretrained_model_name, model_cls=AutoModelForQuestionAnswering)

max_seq_len = 128
vocab = dict(enumerate(range(max_seq_len)))
Step 2: Preprocess dataset
preprocessor = QAPreprocessor(
    hf_tokenizer, id_attr="id", tok_kwargs={"return_overflowing_tokens": True, "max_length": max_seq_len, "stride": 64}
)

proc_df = preprocessor.process_df(squad_df)
proc_df.head(1)
id title context question answers is_valid ans_start_char_idx answer_text ans_end_char_idx proc_question proc_context ans_start_token_idx ans_end_token_idx is_answerable
0 56be85543aeaaa14008c9063 Beyoncé Beyoncé Giselle Knowles-Carter (/biːˈjɒnseɪ/ bee-YON-say) (born September 4, 1981) is an America... When did Beyonce start becoming popular? {'text': ['in the late 1990s'], 'answer_start': [269]} False 269 in the late 1990s 286 When did Beyonce start becoming popular? Beyoncé Giselle Knowles-Carter (/biːˈjɒnseɪ/ bee-YON-say) (born September 4, 1981) is an America... 75 79 True
Step 3: Create your DataBlock
before_batch_tfm = QABatchTokenizeTransform(hf_arch, hf_config, hf_tokenizer, hf_model, max_length=max_seq_len)

blocks = (
    TextBlock(batch_tokenize_tfm=before_batch_tfm, input_return_type=QATextInput),
    CategoryBlock(vocab=vocab),
    CategoryBlock(vocab=vocab),
)

# since it's preprocessed, we include a "text" key with the values of our question and context
def get_x(item):
    return {"text": (item.proc_question, item.proc_context), "id": item.id}


dblock = DataBlock(
    blocks=blocks,
    get_x=get_x,
    get_y=[ItemGetter("ans_start_token_idx"), ItemGetter("ans_end_token_idx")],
    splitter=ColSplitter(),
    n_inp=1,
)
Step 4: Build your DataLoaders
dls = dblock.dataloaders(proc_df, bs=4)
len(dls.train), len(dls.valid)
(733, 108)
b = dls.one_batch()
len(b), len(b[0]), len(b[1]), len(b[2])
(3, 10, 4, 4)
b[0].keys()
dict_keys(['input_ids', 'token_type_ids', 'attention_mask', 'special_tokens_mask', 'offset_mapping', 'id', 'cls_index', 'p_mask', 'start_positions', 'end_positions'])
b[0]["input_ids"].shape, b[0]["attention_mask"].shape, b[1].shape, b[2].shape
(torch.Size([4, 128]), torch.Size([4, 128]), torch.Size([4]), torch.Size([4]))

We can see that any additional data is now located in the inputs dictionary.

b[0]["id"]
['56be8bab3aeaaa14008c90a1',
 '56d4cde92ccc5a1400d83239',
 '56bea8463aeaaa14008c91ac',
 '56becc903aeaaa14008c94a1']
dls.show_batch(dataloaders=dls, max_n=4)
text found start/end answer
0 who was the first record label to give the girls a record deal? ped and danced on the talent show circuit in houston. after seeing the group, r & b producer arne frager brought them to his northern california studio and placed them in star search, the largest talent show on national tv at the time. girl's tyme failed to win, and beyonce later said the song they performed was not good. in 1995 beyonce's father resigned from his job to manage the group. the move reduced beyonce's family's income by half, and her parents were forced to move into separated apartments. mathew False (0, 0)
1 who said that chopin set out " into the wide world, with no very clearly defined aim, forever? " cki, " into the wide world, with no very clearly defined aim, forever. " with woyciechowski, he headed for austria, intending to go on to italy. later that month, in warsaw, the november 1830 uprising broke out, and woyciechowski returned to poland to enlist. chopin, now alone in vienna, was nostalgic for his homeland, and wrote to a friend, " i curse the moment of my departure. " when in september 1831 he learned, while False (0, 0)
2 what short poem spoke of frederic's popularity as a child? yk and his family moved to a building, which still survives, adjacent to the kazimierz palace. during this period, fryderyk was sometimes invited to the belweder palace as playmate to the son of the ruler of russian poland, grand duke constantine ; he played the piano for the duke and composed a march for him. julian ursyn niemcewicz, in his dramatic eclogue, " nasze przebiegi " ( " our discourses ", 1818 ), attested to " little chopin's " popularity True (101, 107) nasze przebiegi
3 which national event caused beyonce to produce " demand a plan? " in a campaign video released on 15 may 2013, where she, along with cameron diaz, john legend and kylie minogue, described inspiration from their mothers, while a number of other artists celebrated personal inspiration from other women, leading to a call for submission of photos of women of viewers'inspiration from which a selection was shown at the concert. beyonce said about her mother tina knowles that her gift was " finding the best qualities in every human being. " with help of the crowdfunding platform catapult, visitors of the concert could choose between several projects promoting education False (0, 0)