wiki40b

en

Use the following command to load this dataset in TFDS:

import tensorflow_datasets as tfds

ds = tfds.load('huggingface:wiki40b/en')
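
Called without a split argument, tfds.load returns a dict mapping split names to tf.data.Dataset objects. A minimal usage sketch, assuming the field names in the feature spec below (the huggingface: namespace builds the dataset from the Hugging Face Hub on first use, so the initial load may take a while):

# Pick one split and decode the scalar string tensors of a single example.
for example in ds['train'].take(1):
    wikidata_id = example['wikidata_id'].numpy().decode('utf-8')
    text = example['text'].numpy().decode('utf-8')
    print(wikidata_id, text[:200])

# A single split can also be requested directly:
# train_ds = tfds.load('huggingface:wiki40b/en', split='train')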
  • Description
Cleaned-up text for 40+ Wikipedia language editions of pages that
correspond to entities. The datasets have train/dev/test splits per language.
The dataset is cleaned up by page filtering to remove disambiguation pages,
redirect pages, deleted pages, and non-entity pages. Each example contains the
wikidata id of the entity and the full Wikipedia article after page processing
that removes non-content sections and structured objects.
  • License: No known license
  • Version: 1.1.0
  • Splits:

Split          Examples
'test'         162274
'train'        2926536
'validation'   163597
  • Features
{
    "wikidata_id": {
        "dtype": "string",
        "id": null,
        "_type": "Value"
    },
    "text": {
        "dtype": "string",
        "id": null,
        "_type": "Value"
    },
    "version_id": {
        "dtype": "string",
        "id": null,
        "_type": "Value"
    }
}
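
All three features are plain strings. In the original wiki40b release, the text field embeds structure markers (_START_ARTICLE_, _START_SECTION_, _START_PARAGRAPH_, with _NEWLINE_ separating lines inside a paragraph). Assuming the HuggingFace mirror keeps that convention, a hedged sketch for recovering plain paragraphs; extract_paragraphs is a hypothetical helper, not part of TFDS:

import re

# Split a wiki40b article string into plain-text paragraphs, assuming the
# _START_*_ / _NEWLINE_ marker convention of the original wiki40b release.
def extract_paragraphs(article: str) -> list[str]:
    paragraphs = []
    # Everything after a _START_PARAGRAPH_ marker, up to the next structure
    # marker, is paragraph text; _NEWLINE_ separates lines within it.
    for chunk in article.split('_START_PARAGRAPH_')[1:]:
        body = re.split(r'_START_(?:ARTICLE|SECTION|PARAGRAPH)_', chunk)[0]
        lines = [line.strip() for line in body.split('_NEWLINE_')]
        paragraphs.append(' '.join(line for line in lines if line))
    return paragraphs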