# Paraphrase Generation with Hugging Face

Paraphrasing is the process of expressing someone else's ideas in your own words: rewriting a text without changing its meaning, and not merely swapping a few words for their synonyms. Paraphrase generation is a pivotal task in natural language processing (NLP), and it also plays a role in a variety of mixed-modality applications that have text as a component. This post surveys the paraphrase generation models and datasets available on the Hugging Face Hub, from T5 models trained on the Google PAWS dataset to multilingual resources such as ai4bharat/IndicParaphrase, and walks through running, evaluating, fine-tuning, and serving them.
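For a quick start, the `text2text-generation` pipeline is the fastest route. Here is a minimal sketch; `Vamsi/T5_Paraphrase_Paws` is one public PAWS-trained T5 checkpoint used purely as an example, and any Hub paraphraser trained with a `paraphrase:` task prefix works the same way:

```python
from transformers import pipeline

# Example checkpoint (an assumption, not an endorsement): a T5 model
# fine-tuned on Google PAWS. Swap in any text2text paraphraser.
paraphraser = pipeline("text2text-generation", model="Vamsi/T5_Paraphrase_Paws")

sentence = "Paraphrasing is the process of expressing an idea in your own words."
outputs = paraphraser(
    "paraphrase: " + sentence,  # T5 checkpoints expect a task prefix
    num_beams=5,
    num_return_sequences=3,     # several candidate paraphrases per input
    max_length=64,
)
for out in outputs:
    print(out["generated_text"])
```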
## Models on the Hub

Hugging Face lists 16 paraphrase generation models as of this writing, and RapidAPI lists 7 freemium and commercial paraphrasers such as QuillBot. The open models fall into a few families:

- **English seq2seq paraphrasers.** The typical Paraphrase-Generator takes an English sentence as input and produces a set of paraphrased sentences. T5 models trained on the Google PAWS dataset are the most common; Pegasus-based (tuner007/pegasus_paraphrase), GPT-2-based, and BART-based models also exist, and BART in particular is effective when fine-tuned for text generation.
- **ChatGPT-derived paraphrasers.** The humarin ChatGPT paraphrase dataset draws on Quora paraphrase questions, texts from SQuAD 2.0, and the CNN news dataset; the accompanying repository fine-tunes GPT-based models via `src/finetune_chatgpt.py`.
- **Multilingual models.** A Persian paraphraser based on the monolingual T5 model, a Malay T5, Turkish bert2bert (INISTA 2021) and mT5 question-paraphrase models (developed as a BLM3010 Computer Project at Yildiz Technical University), a Spanish bert2bert fine-tuned on PAWS-X, AraT5 for Arabic (covering paraphrasing, transliteration, and code-switched translation, with PyTorch and TensorFlow checkpoints available on the Hugging Face website for direct download), ai4bharat/IndicParaphrase (mBART, 11 languages), IndoT5-base trained on translated PAWS, plus Finnish, German, Korean (KoT5), and Ukrainian models.
- **Style-transfer paraphrasers.** The paper "Reformulating Unsupervised Style Transfer as Paraphrase Generation" by Krishna K. et al. released a base diverse paraphraser along with Shakespeare, Bible, Switchboard, and Romantic-poetry models.
- **Detection-evading paraphrasers.** DIPPER ("Discourse Paraphraser") is an 11B model released with the paper "Paraphrasing evades detectors of AI-generated text, but retrieval is an effective defense", with obvious implications for paraphrase detection and academic integrity.
- **Multi-task instruction models.** A tiiuae/falcon-7b fine-tune handles paraphrasing, changing the tone of an input sentence (to casual/professional/witty), and summary and topic generation from a dialogue.
- **Research code.** AESOP ("Paraphrase Generation with Adaptive Syntactic Control", EMNLP 2021) and "Unsupervised Paraphrase Generation using Pre-trained Language Model" publish their code, though both pin old dependency versions (transformers 3.x, PyTorch 1.x) and only support GPU execution.
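You can enumerate the current crop of paraphrasers programmatically with the Hugging Face client (`pip install huggingface_hub`). A sketch; the `pipeline_tag` parameter follows recent `huggingface_hub` releases, so adjust for older versions:

```python
from huggingface_hub import HfApi

api = HfApi()
# Search the Hub for paraphrase models, narrowed to text2text checkpoints.
models = api.list_models(
    search="paraphrase",
    pipeline_tag="text2text-generation",
    limit=10,
)
for model in models:
    print(model.id)
```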
## Datasets

Most of these models are trained on a handful of public paraphrase corpora: Google PAWS, Quora Question Pairs, TaPaCo, the MSR paraphrase corpus, ParaNMT-50M (a filtered version is distributed, with the other style-transfer training sets, via the authors' Google Drive link), BanglaParaphrase (a high-quality Bangla paraphrase dataset), and the ChatGPT paraphrase dataset mentioned above. Structurally, such a dataset consists of pairs of text passages with the same meaning. A sample of community T5 paraphrasers and their training data:

| Model | Dataset | Location |
|---|---|---|
| t5-small | TaPaCo | Hugging Face |
| t5-small | Quora Question Pairs | Hugging Face |
| t5-base | TaPaCo | Hugging Face |

One caveat when building your own corpus: data augmentation by adding surface-level variations does not help downstream tasks, so quality filtering matters (more on that below).
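Loading a corpus like PAWS for training or evaluation is one line with the `datasets` library. A sketch, assuming the Hub id `paws` with its `labeled_final` configuration:

```python
from datasets import load_dataset

# PAWS: pairs of sentences with a binary paraphrase label.
paws = load_dataset("paws", "labeled_final", split="train")

# Keep only the positive pairs (label == 1), i.e. true paraphrases.
pairs = paws.filter(lambda ex: ex["label"] == 1)
print(pairs[0]["sentence1"])
print(pairs[0]["sentence2"])
```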
## Checking paraphrases with sentence embeddings

A recurring theme across model cards is using sentence embeddings to verify that a paraphrase preserves meaning. The Adversarial Paraphrasing Task (APT) is built around exactly this idea, and its authors published a paraphraser designed for the task (refer to `nap_generation.py` on their GitHub repository for ways to better utilize the model). Without the sentence-transformers library, you pass your input through a transformer model and then apply the right pooling operation on top of the contextualized token embeddings. The standard recipe is mean pooling that takes the attention mask into account for correct averaging:

```python
from transformers import AutoTokenizer, AutoModel
import torch

# Mean pooling - take attention mask into account for correct averaging
def mean_pooling(model_output, attention_mask):
    token_embeddings = model_output[0]  # first element holds the token embeddings
    mask = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
    return torch.sum(token_embeddings * mask, 1) / torch.clamp(mask.sum(1), min=1e-9)
```
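Building on that pooling function, here is a sketch of scoring an original/paraphrase pair. The checkpoint name is an assumption; any sentence-embedding model that uses this pooling recipe (for example, the multilingual mpnet models the Ukrainian paraphrase card builds on) will work:

```python
import torch
import torch.nn.functional as F
from transformers import AutoTokenizer, AutoModel

model_name = "sentence-transformers/paraphrase-multilingual-mpnet-base-v2"  # assumed
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)

sentences = [
    "Natural Language Processing can improve the quality of search results.",
    "Search results can be made better with Natural Language Processing.",
]
encoded = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")
with torch.no_grad():
    output = model(**encoded)

embeddings = mean_pooling(output, encoded["attention_mask"])
embeddings = F.normalize(embeddings, p=2, dim=1)
print(f"cosine similarity: {(embeddings[0] @ embeddings[1]).item():.3f}")
```

A high score suggests the paraphrase is on-meaning; pairing this with a lexical-difference measure helps screen out surface-level rewrites.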
## Generation strategies

All of these models inherit `generate()` from `PreTrainedModel` via a mixin class containing the functions for auto-regressive text generation. "Beam" refers to the beam search algorithm used in sequence generation tasks such as machine translation or text generation: a heuristic search that keeps the N most promising partial hypotheses at each decoding step rather than greedily committing to the single best token. Setting `num_return_sequences` under beam search is how the model cards produce "a set of paraphrased sentences" from a single input; if the candidates come out too similar to one another, sampling or diverse beam search (`num_beam_groups` with a `diversity_penalty`) usually helps. Pegasus Paraphrase (tuner007/pegasus_paraphrase) is specifically trained for paraphrasing and appears in the common forum demo of paraphrasing the sentence "Natural Language Processing can improve the quality of …". For the record, the Paraphrase-Generator project itself has been used in published work, including DeepA2 (a modular framework for deep argument analysis with pretrained neural text2text models) and Sports Narrative Enhancement with Natural Language Generation.
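A beam-search sketch with that Pegasus checkpoint (note it requires `pip install sentencepiece`); the decoding parameters mirror its model card, but treat them as starting points:

```python
from transformers import PegasusForConditionalGeneration, PegasusTokenizer

model_name = "tuner007/pegasus_paraphrase"
tokenizer = PegasusTokenizer.from_pretrained(model_name)
model = PegasusForConditionalGeneration.from_pretrained(model_name)

text = "Natural Language Processing can improve the quality of search results."
batch = tokenizer([text], truncation=True, padding="longest",
                  max_length=60, return_tensors="pt")
generated = model.generate(
    **batch,
    max_length=60,
    num_beams=10,            # keep the 10 best partial hypotheses per step
    num_return_sequences=5,  # must be <= num_beams
)
print(tokenizer.batch_decode(generated, skip_special_tokens=True))
```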
## Fine-tuning and evaluation

The quick start above used the Text2TextGeneration pipeline, which Hugging Face released under its transformers library for exactly this kind of conditional text generation. Fine-tuning your own paraphraser on t5-base (as the community Finetuning_T5_Paraphrase_Generation model does) is straightforward; central is ensuring that T5 understands it has to paraphrase, which is done by prepending a task prefix such as `paraphrase: ` to every input (see the sketch after this section). The loss is nothing exotic: like other seq2seq models in transformers, T5 computes token-level cross-entropy over the target sequence whenever you pass `labels`, which answers a frequent forum question about which loss function these models use and how to find out without asking.

Evaluation is the harder half. Almost all conditioned text generation models are validated on two factors: (1) whether the generated text conveys the same meaning as the input, and (2) whether it is fluent, well-formed text. A model trained to evaluate whether a paraphrase is a semantic variation of the input query or just a surface-level variation can automate part of this, and the APT authors show that automating their adversarial task with T5 yields a dataset that also improves accuracy. In practice, though, manual inspection remains the only reliable test of a paraphrasing model, not least because generated paraphrases sometimes contain details, such as dates, that do not exist in the original text.
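A minimal fine-tuning sketch, assuming the positive PAWS pairs prepared earlier; the hyperparameters are placeholders rather than the recipe behind any particular released checkpoint:

```python
from datasets import load_dataset
from transformers import (AutoModelForSeq2SeqLM, AutoTokenizer,
                          DataCollatorForSeq2Seq, Seq2SeqTrainer,
                          Seq2SeqTrainingArguments)

tokenizer = AutoTokenizer.from_pretrained("t5-base")
model = AutoModelForSeq2SeqLM.from_pretrained("t5-base")

def preprocess(example):
    # The task prefix tells T5 that this is a paraphrasing task.
    inputs = tokenizer("paraphrase: " + example["sentence1"],
                       max_length=128, truncation=True)
    labels = tokenizer(text_target=example["sentence2"],
                       max_length=128, truncation=True)
    inputs["labels"] = labels["input_ids"]
    return inputs

train = (load_dataset("paws", "labeled_final", split="train")
         .filter(lambda ex: ex["label"] == 1)
         .map(preprocess, remove_columns=["id", "sentence1", "sentence2", "label"]))

trainer = Seq2SeqTrainer(
    model=model,
    args=Seq2SeqTrainingArguments(output_dir="t5-paraphrase",
                                  per_device_train_batch_size=8,
                                  num_train_epochs=1),
    train_dataset=train,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()  # cross-entropy loss over target tokens, as noted above
```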
## Applications and research directions

Paraphrase generation is a widely studied natural language generation task, and an important one: it feeds other tasks such as machine translation, question answering, and semantic parsing, and it is a workhorse for text data augmentation (one open-source library paraphrases image captions using back translations or transfer learning; a sketch of the back-translation trick follows this section). "Paraphrase Generation: A Survey of the State of the Art" by Jianing Zhou and Suma Bhat (University of Illinois at Urbana-Champaign) maps the field. Most tasks benefit mainly from high-quality paraphrases, namely those that are semantically similar to, yet lexically different from, the source, and existing datasets in the domain often lack exactly that syntactic and lexical diversity. Beyond supervised seq2seq models, researchers have tried entailment-encouraging text generation models that expand a short phrase into content, prompt engineering techniques to enhance the paraphrase generation capabilities of AI chatbots, and open-source toolkits such as catbird that bundle several approaches.
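The back-translation trick, as a sketch assuming the public Helsinki-NLP OPUS-MT English–German checkpoints: translating out of and back into English yields a cheap paraphrase.

```python
from transformers import MarianMTModel, MarianTokenizer

def translate(texts, model_name):
    # Load an OPUS-MT translation model and decode with generate().
    tokenizer = MarianTokenizer.from_pretrained(model_name)
    model = MarianMTModel.from_pretrained(model_name)
    batch = tokenizer(texts, return_tensors="pt", padding=True, truncation=True)
    generated = model.generate(**batch, max_length=128)
    return tokenizer.batch_decode(generated, skip_special_tokens=True)

captions = ["A dog runs across a green field chasing a ball."]
german = translate(captions, "Helsinki-NLP/opus-mt-en-de")
paraphrases = translate(german, "Helsinki-NLP/opus-mt-de-en")
print(paraphrases)
```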
## Serving and tooling

With the development of neural models, paraphrase generation has moved from research code to things you can deploy. A common recipe is a paraphrase generator that allows the user to vary the output using the T5 architecture, with FastAPI serving the model and Svelte providing the web front end; question paraphrasing works the same way, as in Google's T5-base fine-tuned on the Quora question pair dataset. On the tooling side, promptfoo includes support for the HuggingFace Inference API for text generation, classification, and embeddings tasks, as well as HuggingFace Datasets, which is handy for regression-testing a paraphraser's outputs.
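A minimal serving sketch (run with `uvicorn app:app`); the checkpoint is the same assumed example as before, so swap in your own fine-tuned model:

```python
from fastapi import FastAPI
from pydantic import BaseModel
from transformers import pipeline

app = FastAPI()
# Assumed example checkpoint; replace with your fine-tuned paraphraser.
paraphraser = pipeline("text2text-generation", model="Vamsi/T5_Paraphrase_Paws")

class ParaphraseRequest(BaseModel):
    text: str
    num_outputs: int = 3

@app.post("/paraphrase")
def paraphrase(req: ParaphraseRequest):
    outputs = paraphraser(
        "paraphrase: " + req.text,
        num_beams=max(req.num_outputs, 4),  # num_return_sequences <= num_beams
        num_return_sequences=req.num_outputs,
        max_length=64,
    )
    return {"paraphrases": [o["generated_text"] for o in outputs]}
```

A Svelte front end then only needs to POST to `/paraphrase` and render the returned list.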