pec


happy

Use the following command to load this dataset in TFDS:

import tensorflow_datasets as tfds
ds = tfds.load('huggingface:pec/happy')
  • Description:
A dataset of around 350K persona-based empathetic conversations. 
Each speaker is associated with a persona, which comprises multiple persona sentences. 
The response of each conversation is empathetic.
  • License: No known license
  • Version: 1.0.0
  • Splits:
Split Examples
'test' 22730
'train' 157195
'validation' 19829
  • Features:
{
    "personas": {
        "feature": {
            "dtype": "string",
            "id": null,
            "_type": "Value"
        },
        "length": -1,
        "id": null,
        "_type": "Sequence"
    },
    "context": {
        "feature": {
            "dtype": "string",
            "id": null,
            "_type": "Value"
        },
        "length": -1,
        "id": null,
        "_type": "Sequence"
    },
    "context_speakers": {
        "feature": {
            "dtype": "string",
            "id": null,
            "_type": "Value"
        },
        "length": -1,
        "id": null,
        "_type": "Sequence"
    },
    "response": {
        "dtype": "string",
        "id": null,
        "_type": "Value"
    },
    "response_speaker": {
        "dtype": "string",
        "id": null,
        "_type": "Value"
    }
}
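
To see what one example looks like in practice, the snippet below is a minimal sketch (assuming the 'huggingface:pec/happy' builder resolves through the Hugging Face namespace of your TFDS installation) that loads the train split and decodes the string features listed above:

import tensorflow_datasets as tfds

# Minimal sketch: load one example from the 'happy' config and decode its
# string features (assumes the huggingface:pec/happy builder is available
# in your TFDS installation).
ds = tfds.load('huggingface:pec/happy', split='train')

for example in ds.take(1):
    # Sequence features come back as 1-D string tensors, scalar features as
    # single string tensors; decode the bytes for readable output.
    personas = [p.decode('utf-8') for p in example['personas'].numpy()]
    context = [c.decode('utf-8') for c in example['context'].numpy()]
    speakers = [s.decode('utf-8') for s in example['context_speakers'].numpy()]
    response = example['response'].numpy().decode('utf-8')
    response_speaker = example['response_speaker'].numpy().decode('utf-8')

    print('persona sentences:', len(personas))
    print('last context turn:', speakers[-1] if speakers else '?',
          '-', context[-1] if context else '')
    print('response:', response_speaker, '-', response)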

offmychest

Use the following command to load this dataset in TFDS:

import tensorflow_datasets as tfds
ds = tfds.load('huggingface:pec/offmychest')
  • Description:
A dataset of around 350K persona-based empathetic conversations. 
Each speaker is associated with a persona, which comprises multiple persona sentences. 
The response of each conversation is empathetic.
  • License: No known license
  • Version: 1.0.0
  • Splits:
Split Examples
'test' 15324
'train' 123968
'validation' 16004
  • Features:
{
    "personas": {
        "feature": {
            "dtype": "string",
            "id": null,
            "_type": "Value"
        },
        "length": -1,
        "id": null,
        "_type": "Sequence"
    },
    "context": {
        "feature": {
            "dtype": "string",
            "id": null,
            "_type": "Value"
        },
        "length": -1,
        "id": null,
        "_type": "Sequence"
    },
    "context_speakers": {
        "feature": {
            "dtype": "string",
            "id": null,
            "_type": "Value"
        },
        "length": -1,
        "id": null,
        "_type": "Sequence"
    },
    "response": {
        "dtype": "string",
        "id": null,
        "_type": "Value"
    },
    "response_speaker": {
        "dtype": "string",
        "id": null,
        "_type": "Value"
    }
}

all

Use the following command to load this dataset in TFDS:

import tensorflow_datasets as tfds
ds = tfds.load('huggingface:pec/all')
  • Description:
A dataset of around 350K persona-based empathetic conversations. 
Each speaker is associated with a persona, which comprises multiple persona sentences. 
The response of each conversation is empathetic.
  • License: No known license
  • Version: 1.0.0
  • Splits:
Split Examples
'test' 38054
'train' 281163
'validation' 35833
  • Features:
{
    "personas": {
        "feature": {
            "dtype": "string",
            "id": null,
            "_type": "Value"
        },
        "length": -1,
        "id": null,
        "_type": "Sequence"
    },
    "context": {
        "feature": {
            "dtype": "string",
            "id": null,
            "_type": "Value"
        },
        "length": -1,
        "id": null,
        "_type": "Sequence"
    },
    "context_speakers": {
        "feature": {
            "dtype": "string",
            "id": null,
            "_type": "Value"
        },
        "length": -1,
        "id": null,
        "_type": "Sequence"
    },
    "response": {
        "dtype": "string",
        "id": null,
        "_type": "Value"
    },
    "response_speaker": {
        "dtype": "string",
        "id": null,
        "_type": "Value"
    }
}
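
As a quick sanity check, the snippet below is a small sketch (again assuming the 'huggingface:pec/all' builder resolves in your TFDS installation) that loads the 'all' config together with its metadata and compares it against the feature spec and split sizes above:

import tensorflow_datasets as tfds

# Small sketch: load the 'all' config with its metadata (assumes the
# huggingface:pec/all builder is available in your TFDS installation).
ds, info = tfds.load('huggingface:pec/all', split='train', with_info=True)

# The metadata should match the catalog entries above.
print(info.features)                           # personas, context, context_speakers, response, response_speaker
print(info.splits['train'].num_examples)       # 281163
print(info.splits['validation'].num_examples)  # 35833
print(info.splits['test'].num_examples)        # 38054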