
A word embedding is a learned representation for text in which words with similar meanings have similar representations. This approach to representing words and documents can be considered one of the key breakthroughs of deep learning on challenging natural language processing problems.
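To make this concrete, here is a minimal sketch (not from the video) that trains a tiny Word2Vec model with the gensim library; the toy corpus, hyperparameters, and printed checks are illustrative assumptions only, not the tutorial's own code.

# A minimal sketch, assuming gensim 4.x is installed (pip install gensim).
# The tiny toy corpus below is only to illustrate the idea; real embeddings
# need much larger text collections.
from gensim.models import Word2Vec

sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "boy", "plays", "football"],
    ["the", "girl", "plays", "football"],
    ["the", "man", "drives", "a", "car"],
    ["the", "woman", "drives", "a", "car"],
]

# vector_size is the embedding dimension; min_count=1 keeps every word
# because the corpus is so small.
model = Word2Vec(sentences, vector_size=50, window=3, min_count=1, epochs=200, seed=42)

print(model.wv["king"][:5])                # first 5 dimensions of the learned vector for "king"
print(model.wv.most_similar("king"))       # words whose vectors lie closest to "king"
print(model.wv.similarity("boy", "girl"))  # cosine similarity between two learned vectors

Words used in similar contexts ("king"/"queen", "boy"/"girl") tend to end up with nearby vectors, which is exactly the property the description above refers to.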

Please join my channel as a member to get additional benefits like Data Science materials, members-only live streams, and more:
https://www.youtube.com/channel/UCNU_lfiiWBdtULKOw6X0Dig/join

Please also subscribe to my other channel:
https://www.youtube.com/channel/UCjWY5hREA6FFYrthD0rZNIw

If you want to make a donation to support my channel, the GPay ID is below.
GPay: krishnaik06@okicici

Connect with me here:

Twitter: https://twitter.com/Krishnaik06

Facebook: https://www.facebook.com/krishnaik06

Instagram: https://www.instagram.com/krishnaik06


Comments

Klien Mañago says:

Good day, may I ask how to define specific dimensions of features (for example, I want to extract linguistic features such as part-of-speech tags, word density, and word frequency) that are going to be vectorized?

Mohammad Kazemi Beydokhti says:

Thanks for the tutorial on word embedding models. I wonder how features are selected in these models? I think in some particular cases having control over customizing these features might improve the chance of getting more similar words than just using the pretrained ones.

Nazwa Kurde says:

thanks this is awesome

José Roberto Homeli da Silva says:

unfortunately, I cannot like your video 300 times <3

louer le seigneur says:

Thanks Krish

Usashi Chatterjee says:

Sir your explanations are fantastic

Utkarsh says:

His channel and StatQuest are two of the best resources for ML and Data Science on YouTube.

Suketu Dave says:

Pen Pineapple Apple Pen 🙂

ROHIT MONDAL says:

good explanation sir

Muhammad Noman Khan, Assistant Professor, Department of Journalism and Mass Communication, UoP says:

Excellent tutorials.

suvarna deore says:

Thank you sir

Aditya Chauhan says:

MAKING NO ONE ANY WISER.

Ghizlane BOUSKRI says:

Krish, you save my life every time

Yang Li says:

you are 200% better than my professor in explaining Word Embedding

trex midnite says:

How to convert man to woman?

MavaaMusicMachine says:

Amazing explanation thank you!

R_Py Data Science says:

I don’t really get word embedding. Does it work outside mainstream English? For example, medical language is different. If I am studying medical literature, a lot of my main vocabulary consists of medical terms. What is your opinion on this?

Dharmendra Thakur says:

I have one doubt, can you make it clear: how are the values assigned for opposite genders, like -1 for boy and 1 for girl?

Dharmendra Thakur says:

You have done a great job. There are students like me who really need such an explanation to get a rough picture… at least, thanks a lot

Sandeep says:

Many thanks

Golden Water says:

I wish you could shorten the videos.

Dinesh Babu says:

Someone please explain to me what vocabulary and vocabulary size are. I'm confused by these terms.

Patrick Adjei says:

I don't usually subscribe to YouTube channels… but this first video I watched from you got me.
