Comments on: BERT Explained – A list of Frequently Asked Questions https://yashuseth.blog/2019/06/12/bert-explained-faqs-understand-bert-working/ A blog on data science, machine learning and artificial intelligence. Sun, 14 Jun 2020 07:33:29 +0000

By: Abdellah Frindou https://yashuseth.blog/2019/06/12/bert-explained-faqs-understand-bert-working/comment-page-1/#comment-15419 Sat, 01 Feb 2020 18:42:54 +0000 http://yashuseth.blog/?p=552#comment-15419 I have a question: is it possible to train such a model for text classification and question answering at the same time? That is, give the input as a [CLS]question[SEP]paragraph pair and get both a category and an answer as output?

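The setup described in this comment (one shared encoder, a classification head on [CLS], and a span-prediction head on the token outputs) is a common multi-task pattern. A minimal sketch of it, not code from the post, assuming PyTorch and the Hugging Face transformers library; the class and its names are illustrative:

import torch.nn as nn
from transformers import BertModel

class JointClassifierAndQA(nn.Module):
    """Hypothetical multi-task model: one BERT encoder, two output heads."""
    def __init__(self, num_categories, model_name="bert-base-uncased"):
        super().__init__()
        self.encoder = BertModel.from_pretrained(model_name)
        hidden = self.encoder.config.hidden_size
        self.classifier = nn.Linear(hidden, num_categories)  # category from the [CLS] vector
        self.qa_head = nn.Linear(hidden, 2)                   # start / end logits per token

    def forward(self, input_ids, attention_mask, token_type_ids):
        out = self.encoder(input_ids=input_ids,
                           attention_mask=attention_mask,
                           token_type_ids=token_type_ids)
        cls_vec = out.last_hidden_state[:, 0]                 # final state of the [CLS] token
        category_logits = self.classifier(cls_vec)            # sentence-level category
        start_logits, end_logits = self.qa_head(out.last_hidden_state).split(1, dim=-1)
        return category_logits, start_logits.squeeze(-1), end_logits.squeeze(-1)

During training, the classification loss and the answer-span start/end losses can simply be summed over the same [CLS]question[SEP]paragraph input.
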
By: Word Embedding Model: Word2vec, CamemBERT and USE | Le Blog de Baamtu https://yashuseth.blog/2019/06/12/bert-explained-faqs-understand-bert-working/comment-page-1/#comment-15407 Tue, 28 Jan 2020 10:52:59 +0000 http://yashuseth.blog/?p=552#comment-15407 […] https://yashuseth.blog/2019/06/12/bert-explained-faqs-understand-bert-working/ […]

By: tam https://yashuseth.blog/2019/06/12/bert-explained-faqs-understand-bert-working/comment-page-1/#comment-15369 Tue, 14 Jan 2020 09:53:07 +0000 http://yashuseth.blog/?p=552#comment-15369 Great post! It nicely summarizes the paper. Thank you.

By: Sudharsan Kumar https://yashuseth.blog/2019/06/12/bert-explained-faqs-understand-bert-working/comment-page-1/#comment-15249 Sat, 30 Nov 2019 05:27:48 +0000 http://yashuseth.blog/?p=552#comment-15249 Nice post. It explains how BERT works in a very natural way.

By: ILLUSTRATION OF BERT – lbourdois https://yashuseth.blog/2019/06/12/bert-explained-faqs-understand-bert-working/comment-page-1/#comment-15242 Wed, 27 Nov 2019 09:43:10 +0000 http://yashuseth.blog/?p=552#comment-15242 […] the following gives a recap of the information related to BERT in the form of an FAQ: https://yashuseth.blog/2019/06/12/bert-explained-faqs-understand-bert-working/ (in […]

By: learning_bnert https://yashuseth.blog/2019/06/12/bert-explained-faqs-understand-bert-working/comment-page-1/#comment-15173 Tue, 05 Nov 2019 12:47:36 +0000 http://yashuseth.blog/?p=552#comment-15173 “The final hidden states (the transformer outputs) of the input tokens can be concatenated and / or pooled together to get the encoded representation of a sentence.”

Did you mean the [CLS] token here, or the tuple (sequence output + pooled output)?

How do you do concatenation and pooling together?

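One way to read the quoted sentence: the final state of the [CLS] token is one option, and the per-token outputs can also be pooled (for example mean- or max-pooled), with several such vectors concatenated into one representation. A minimal sketch of these options, not code from the post, assuming PyTorch and the Hugging Face transformers library:

import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")

inputs = tokenizer("BERT builds deep bidirectional representations.", return_tensors="pt")
with torch.no_grad():
    out = model(**inputs)

hidden = out.last_hidden_state                  # (1, seq_len, 768): one vector per token
cls_vec = hidden[:, 0]                          # option 1: final state of the [CLS] token
mean_vec = hidden.mean(dim=1)                   # option 2: mean-pool over all tokens
max_vec = hidden.max(dim=1).values              # option 3: max-pool over all tokens
sentence_vec = torch.cat([cls_vec, mean_vec, max_vec], dim=-1)  # concatenation of pooled views

Any one of these vectors, or their concatenation, can then serve as the fixed-size encoding of the sentence.
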
By: Alex https://yashuseth.blog/2019/06/12/bert-explained-faqs-understand-bert-working/comment-page-1/#comment-15091 Tue, 08 Oct 2019 17:14:46 +0000 http://yashuseth.blog/?p=552#comment-15091 Could you add more on the training tasks, Masked Language Modeling and Next Sentence Prediction? Was BERT trained on them consecutively or in parallel? Wouldn't they require different architectures to be trained in parallel?

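In the original BERT paper the two pre-training tasks are learned jointly rather than consecutively: both output heads sit on the same encoder, and the Masked LM loss and the Next Sentence Prediction loss are summed over the same forward pass, so no separate architecture is needed. A minimal sketch, not code from the post, using Hugging Face's BertForPreTraining class, which exposes both heads:

from transformers import BertTokenizer, BertForPreTraining

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForPreTraining.from_pretrained("bert-base-uncased")

# A sentence pair, as used for Next Sentence Prediction.
inputs = tokenizer("The cat sat on the mat.", "It then fell asleep.", return_tensors="pt")
out = model(**inputs)

mlm_logits = out.prediction_logits        # (batch, seq_len, vocab): masked-token predictions
nsp_logits = out.seq_relationship_logits  # (batch, 2): IsNext / NotNext prediction
# During pre-training, loss = cross-entropy(MLM) + cross-entropy(NSP) from this single pass.
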
By: dennymarcels https://yashuseth.blog/2019/06/12/bert-explained-faqs-understand-bert-working/comment-page-1/#comment-15031 Thu, 03 Oct 2019 17:53:31 +0000 http://yashuseth.blog/?p=552#comment-15031 I've been reading about BERT for the last two weeks straight, and this is the most concise and informative summary I've found. Congratulations and thank you!

By: Yashu Seth https://yashuseth.blog/2019/06/12/bert-explained-faqs-understand-bert-working/comment-page-1/#comment-14270 Fri, 12 Jul 2019 16:47:13 +0000 http://yashuseth.blog/?p=552#comment-14270 In reply to Wenyi Tao.

Thanks Wenyi!!

By: Wenyi Tao https://yashuseth.blog/2019/06/12/bert-explained-faqs-understand-bert-working/comment-page-1/#comment-14269 Fri, 12 Jul 2019 04:41:44 +0000 http://yashuseth.blog/?p=552#comment-14269 This is a great post! Thanks for sharing.
