multi_re_qa

SearchQA

Use the following command to load this dataset in TFDS:

ds = tfds.load('huggingface:multi_re_qa/SearchQA')
  • Description:
MultiReQA contains sentence boundary annotations from eight publicly available QA datasets: SearchQA, TriviaQA, HotpotQA, NaturalQuestions, SQuAD, BioASQ, RelationExtraction, and TextbookQA. Five of these datasets (SearchQA, TriviaQA, HotpotQA, NaturalQuestions, and SQuAD) contain both training and test data, while the remaining three (BioASQ, RelationExtraction, and TextbookQA) contain only test data.
  • License: No known license
  • Version: 1.0.0
  • Splits:
Split         Examples
'train'       3163801
'validation'  454836
  • Features:
{
    "candidate_id": {
        "dtype": "string",
        "id": null,
        "_type": "Value"
    },
    "response_start": {
        "dtype": "int32",
        "id": null,
        "_type": "Value"
    },
    "response_end": {
        "dtype": "int32",
        "id": null,
        "_type": "Value"
    }
}
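
A minimal usage sketch follows, assuming the 'huggingface:' community namespace resolves in your TFDS installation and the underlying data can be fetched; the field names ('candidate_id', 'response_start', 'response_end') come from the feature listing above, everything else is illustrative.

import tensorflow_datasets as tfds

# Load the SearchQA config and inspect a few training examples.
ds = tfds.load('huggingface:multi_re_qa/SearchQA', split='train')
for example in ds.take(3):
    # Each example is a dict of tensors matching the documented features.
    candidate_id = example['candidate_id'].numpy().decode('utf-8')
    start = int(example['response_start'])
    end = int(example['response_end'])
    print(candidate_id, start, end)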

TriviaQA

Use the following command to load this dataset in TFDS:

ds = tfds.load('huggingface:multi_re_qa/TriviaQA')
  • Description:
MultiReQA contains sentence boundary annotations from eight publicly available QA datasets: SearchQA, TriviaQA, HotpotQA, NaturalQuestions, SQuAD, BioASQ, RelationExtraction, and TextbookQA. Five of these datasets (SearchQA, TriviaQA, HotpotQA, NaturalQuestions, and SQuAD) contain both training and test data, while the remaining three (BioASQ, RelationExtraction, and TextbookQA) contain only test data.
  • License: No known license
  • Version: 1.0.0
  • Splits:
Split         Examples
'train'       1893674
'validation'  238339
  • Features:
{
    "candidate_id": {
        "dtype": "string",
        "id": null,
        "_type": "Value"
    },
    "response_start": {
        "dtype": "int32",
        "id": null,
        "_type": "Value"
    },
    "response_end": {
        "dtype": "int32",
        "id": null,
        "_type": "Value"
    }
}

HotpotQA

Use the following command to load this dataset in TFDS:

ds = tfds.load('huggingface:multi_re_qa/HotpotQA')
  • Description:
MultiReQA contains sentence boundary annotations from eight publicly available QA datasets: SearchQA, TriviaQA, HotpotQA, NaturalQuestions, SQuAD, BioASQ, RelationExtraction, and TextbookQA. Five of these datasets (SearchQA, TriviaQA, HotpotQA, NaturalQuestions, and SQuAD) contain both training and test data, while the remaining three (BioASQ, RelationExtraction, and TextbookQA) contain only test data.
  • License: No known license
  • Version: 1.0.0
  • Splits:
Split         Examples
'train'       508879
'validation'  52191
  • Features:
{
    "candidate_id": {
        "dtype": "string",
        "id": null,
        "_type": "Value"
    },
    "response_start": {
        "dtype": "int32",
        "id": null,
        "_type": "Value"
    },
    "response_end": {
        "dtype": "int32",
        "id": null,
        "_type": "Value"
    }
}

SQuAD

Use the following command to load this dataset in TFDS:

ds = tfds.load('huggingface:multi_re_qa/SQuAD')
  • Description:
MultiReQA contains sentence boundary annotations from eight publicly available QA datasets: SearchQA, TriviaQA, HotpotQA, NaturalQuestions, SQuAD, BioASQ, RelationExtraction, and TextbookQA. Five of these datasets (SearchQA, TriviaQA, HotpotQA, NaturalQuestions, and SQuAD) contain both training and test data, while the remaining three (BioASQ, RelationExtraction, and TextbookQA) contain only test data.
  • License: No known license
  • Version: 1.0.0
  • Splits:
Split         Examples
'train'       95659
'validation'  10642
  • Features:
{
    "candidate_id": {
        "dtype": "string",
        "id": null,
        "_type": "Value"
    },
    "response_start": {
        "dtype": "int32",
        "id": null,
        "_type": "Value"
    },
    "response_end": {
        "dtype": "int32",
        "id": null,
        "_type": "Value"
    }
}

NaturalQuestions

Use the following command to load this dataset in TFDS:

ds = tfds.load('huggingface:multi_re_qa/NaturalQuestions')
  • Description:
MultiReQA contains sentence boundary annotations from eight publicly available QA datasets: SearchQA, TriviaQA, HotpotQA, NaturalQuestions, SQuAD, BioASQ, RelationExtraction, and TextbookQA. Five of these datasets (SearchQA, TriviaQA, HotpotQA, NaturalQuestions, and SQuAD) contain both training and test data, while the remaining three (BioASQ, RelationExtraction, and TextbookQA) contain only test data.
  • License: No known license
  • Version: 1.0.0
  • Splits:
Split         Examples
'train'       448355
'validation'  22118
  • Features:
{
    "candidate_id": {
        "dtype": "string",
        "id": null,
        "_type": "Value"
    },
    "response_start": {
        "dtype": "int32",
        "id": null,
        "_type": "Value"
    },
    "response_end": {
        "dtype": "int32",
        "id": null,
        "_type": "Value"
    }
}

BioASQ

Use the following command to load this dataset in TFDS:

ds = tfds.load('huggingface:multi_re_qa/BioASQ')
  • Description:
MultiReQA contains sentence boundary annotations from eight publicly available QA datasets: SearchQA, TriviaQA, HotpotQA, NaturalQuestions, SQuAD, BioASQ, RelationExtraction, and TextbookQA. Five of these datasets (SearchQA, TriviaQA, HotpotQA, NaturalQuestions, and SQuAD) contain both training and test data, while the remaining three (BioASQ, RelationExtraction, and TextbookQA) contain only test data.
  • License: No known license
  • Version: 1.0.0
  • Splits:
Split         Examples
'test'        14158
  • Features:
{
    "candidate_id": {
        "dtype": "string",
        "id": null,
        "_type": "Value"
    },
    "response_start": {
        "dtype": "int32",
        "id": null,
        "_type": "Value"
    },
    "response_end": {
        "dtype": "int32",
        "id": null,
        "_type": "Value"
    }
}
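
BioASQ, like RelationExtraction, TextbookQA, and DuoRC below, ships only a 'test' split, so the split has to be named explicitly; a minimal sketch under the same assumptions as the SearchQA example above:

import tensorflow_datasets as tfds

# Test-only configs are loaded by requesting the 'test' split directly.
ds = tfds.load('huggingface:multi_re_qa/BioASQ', split='test')
# The split table above lists 14158 examples; count them to confirm.
num_examples = sum(1 for _ in ds)
print(num_examples)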

RelationExtraction

Use the following command to load this dataset in TFDS:

ds = tfds.load('huggingface:multi_re_qa/RelationExtraction')
  • Description:
MultiReQA contains sentence boundary annotations from eight publicly available QA datasets: SearchQA, TriviaQA, HotpotQA, NaturalQuestions, SQuAD, BioASQ, RelationExtraction, and TextbookQA. Five of these datasets (SearchQA, TriviaQA, HotpotQA, NaturalQuestions, and SQuAD) contain both training and test data, while the remaining three (BioASQ, RelationExtraction, and TextbookQA) contain only test data.
  • License: No known license
  • Version: 1.0.0
  • Splits:
Split         Examples
'test'        3301
  • Features:
{
    "candidate_id": {
        "dtype": "string",
        "id": null,
        "_type": "Value"
    },
    "response_start": {
        "dtype": "int32",
        "id": null,
        "_type": "Value"
    },
    "response_end": {
        "dtype": "int32",
        "id": null,
        "_type": "Value"
    }
}

TextbookQA

Use the following command to load this dataset in TFDS:

ds = tfds.load('huggingface:multi_re_qa/TextbookQA')
  • Description:
MultiReQA contains sentence boundary annotations from eight publicly available QA datasets: SearchQA, TriviaQA, HotpotQA, NaturalQuestions, SQuAD, BioASQ, RelationExtraction, and TextbookQA. Five of these datasets (SearchQA, TriviaQA, HotpotQA, NaturalQuestions, and SQuAD) contain both training and test data, while the remaining three (BioASQ, RelationExtraction, and TextbookQA) contain only test data.
  • License: No known license
  • Version: 1.0.0
  • Splits:
Split         Examples
'test'        71147
  • Features:
{
    "candidate_id": {
        "dtype": "string",
        "id": null,
        "_type": "Value"
    },
    "response_start": {
        "dtype": "int32",
        "id": null,
        "_type": "Value"
    },
    "response_end": {
        "dtype": "int32",
        "id": null,
        "_type": "Value"
    }
}

DuoRC

Use the following command to load this dataset in TFDS:

ds = tfds.load('huggingface:multi_re_qa/DuoRC')
  • Description:
MultiReQA contains sentence boundary annotations from eight publicly available QA datasets: SearchQA, TriviaQA, HotpotQA, NaturalQuestions, SQuAD, BioASQ, RelationExtraction, and TextbookQA. Five of these datasets (SearchQA, TriviaQA, HotpotQA, NaturalQuestions, and SQuAD) contain both training and test data, while the remaining three (BioASQ, RelationExtraction, and TextbookQA) contain only test data.
  • License: No known license
  • Version: 1.0.0
  • Splits:
Split         Examples
'test'        5525
  • Features:
{
    "candidate_id": {
        "dtype": "string",
        "id": null,
        "_type": "Value"
    },
    "response_start": {
        "dtype": "int32",
        "id": null,
        "_type": "Value"
    },
    "response_end": {
        "dtype": "int32",
        "id": null,
        "_type": "Value"
    }
}
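
Putting the page together, the sketch below loads every config listed above with the splits it provides; the grouping into train/validation versus test-only configs is taken directly from the split tables, and the loop itself is illustrative.

import tensorflow_datasets as tfds

# Configs with 'train' and 'validation' splits, per the tables above.
TRAIN_VAL_CONFIGS = ['SearchQA', 'TriviaQA', 'HotpotQA', 'SQuAD', 'NaturalQuestions']
# Configs that ship only a 'test' split.
TEST_ONLY_CONFIGS = ['BioASQ', 'RelationExtraction', 'TextbookQA', 'DuoRC']

datasets = {}
for name in TRAIN_VAL_CONFIGS:
    train_ds, val_ds = tfds.load(f'huggingface:multi_re_qa/{name}',
                                 split=['train', 'validation'])
    datasets[name] = {'train': train_ds, 'validation': val_ds}
for name in TEST_ONLY_CONFIGS:
    datasets[name] = {'test': tfds.load(f'huggingface:multi_re_qa/{name}', split='test')}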