word vectors
-document embeddings via an additive composition model
->because it's simple
-compute cos(doc, query)
->works quite well
-compare similarity between documents and queries according to cosine (sketch below)
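A minimal sketch of the additive composition + cosine ranking idea above, assuming a hypothetical `word_vectors` dict of token -> numpy array (not the exact model from the talk):

```python
import numpy as np

def embed_text(tokens, word_vectors, dim=300):
    """Document/query embedding = sum of its word vectors (additive composition)."""
    vecs = [word_vectors[t] for t in tokens if t in word_vectors]
    return np.sum(vecs, axis=0) if vecs else np.zeros(dim)

def cosine(a, b):
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / denom) if denom else 0.0

def rank_documents(query_tokens, docs_tokens, word_vectors):
    """Rank document indices by cos(doc, query), highest first."""
    q = embed_text(query_tokens, word_vectors)
    scores = [cosine(embed_text(d, word_vectors), q) for d in docs_tokens]
    return sorted(range(len(docs_tokens)), key=lambda i: scores[i], reverse=True)
```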
-!{...}
-better to use cf not
-documents and queries can be in different languages
-!also helps in monolingual settings
-when docs and queries live in different languages,
-...is this resolved with a bridging language?
->English is well connected => other languages
-...does English as a bridging language help?
-..>yes, there is a shared task in cross-lingual IR
-!not such a huge deal anymore, just a nice toy application
->with this simple model
-idea: you can do more advanced approaches using embeddings
-+another application I like: word embeddings in different languages,
 rank target-language words given a source-language query word
-+pretty much all of these word embeddings are evaluated on this (sketch below)
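A minimal sketch of that evaluation setup (bilingual lexicon induction): given a source-language word, rank target-language words by cosine similarity in a shared space. `src_vectors` and `tgt_vectors` are hypothetical dicts of word -> numpy vector assumed to live in the same embedding space:

```python
import numpy as np

def translate(word, src_vectors, tgt_vectors, k=5):
    """Return the k nearest target-language words to a source-language word."""
    q = src_vectors[word]
    q = q / np.linalg.norm(q)
    tgt_words = list(tgt_vectors)
    M = np.vstack([tgt_vectors[w] for w in tgt_words])
    M = M / np.linalg.norm(M, axis=1, keepdims=True)
    sims = M @ q
    top = np.argsort(-sims)[:k]
    return [(tgt_words[i], float(sims[i])) for i in top]
```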
-:one interesting thing: type-2 models can outperform {...}
-...the improvement is not spectacular
-§Q: "what are the numbers?"
-§A: not so many / not so bad
-(...)~70% accuracy
-one of the problems in this sub-community is that we don't have good ways to [evaluate?]
-English-German is quite a close language pair
-people publish contradictory results
->different results/scorings in every paper
-one paper improves 95% => 95.4%, another reports 95.6%
->unconvinced there is a good cross-lingual evaluation
-further questions: what to do with cross-lingual embeddings
-...phrase-level {etc.}
-instead of a conclusion, here is a summary slide
-..."(Intrinsic)..."
---Q&A---
-discussion this week: what other tasks could there be; reach into other domains and see if something could be done
-...cross-lingual lexicon induction
-...more like a question to the whole community
-...should ask more interesting questions, like cross-lingual {...}
-a study on using NMT embeddings to improve semantic similarity for English
-...each model got ~0.5 on SimLex (sketch below)
-!not sure how to compare
-...one problem is the need for parallel data (framework [other/own])
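A minimal sketch of the standard SimLex-style comparison mentioned above: Spearman correlation between model cosine similarities and human ratings. `pairs` (word1, word2, gold score) and `vectors` are assumed inputs, not the actual study's setup:

```python
import numpy as np
from scipy.stats import spearmanr

def evaluate_word_similarity(pairs, vectors):
    """Spearman rho between cosine similarities and human similarity ratings."""
    model_scores, human_scores = [], []
    for w1, w2, gold in pairs:
        if w1 in vectors and w2 in vectors:
            a, b = vectors[w1], vectors[w2]
            model_scores.append(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
            human_scores.append(gold)
    rho, _ = spearmanr(model_scores, human_scores)
    return rho  # e.g. around 0.5 for the models discussed above
```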
-..love'.'{...}
-§these word vectors are still being compared against, even though many of the ideas are similar; in fact it is the same main principle and hugely resembles LSI
-=>learn word vectors,
-...compare skip-gram vectors with LSI (sketch below)
-higher scores for skip-gram
-two sides of the same coin, 'imo'
-...roots in psychology, as LSI was developed in psychology
-a representation of the brain
-...same idea, not implemented the same way
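A toy sketch of that comparison, not the evaluation from the discussion: train skip-gram vectors and LSI-style vectors (truncated SVD over a word-word count matrix) on the same tiny hypothetical corpus and look at nearest neighbours in each space:

```python
from collections import Counter
import numpy as np
from sklearn.decomposition import TruncatedSVD
from gensim.models import Word2Vec

corpus = [["the", "cat", "sat", "on", "the", "mat"],
          ["the", "dog", "sat", "on", "the", "rug"]]

# Skip-gram vectors (sg=1 selects skip-gram in gensim).
sg = Word2Vec(corpus, vector_size=50, sg=1, min_count=1, epochs=50)

# LSI-style vectors: truncated SVD over a word-word co-occurrence matrix.
vocab = sorted({w for sent in corpus for w in sent})
idx = {w: i for i, w in enumerate(vocab)}
counts = Counter()
for sent in corpus:
    for i, w in enumerate(sent):
        for c in sent[max(0, i - 2):i + 3]:
            if c != w:
                counts[(idx[w], idx[c])] += 1
M = np.zeros((len(vocab), len(vocab)))
for (i, j), n in counts.items():
    M[i, j] = n
lsi_vectors = TruncatedSVD(n_components=5).fit_transform(M)

def nearest(word, space):
    """Nearest neighbour of `word` by cosine similarity in the given space."""
    v = space[word]
    sims = {w: space[w] @ v / (np.linalg.norm(space[w]) * np.linalg.norm(v) + 1e-9)
            for w in vocab if w != word}
    return max(sims, key=sims.get)

sg_space = {w: sg.wv[w] for w in vocab}
lsi_space = {w: lsi_vectors[idx[w]] for w in vocab}
print(nearest("cat", sg_space), nearest("cat", lsi_space))
```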
-§is there any research on learning the essence of a word?
-=>in a Platonic sense?
-...essence of a word: king - man + woman = queen (sketch below)
-a language-independent space
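A minimal sketch of the analogy arithmetic mentioned above (king - man + woman ≈ queen), assuming a hypothetical `vectors` dict of word -> numpy array from some pretrained embedding model:

```python
import numpy as np

def analogy(a, b, c, vectors, k=1):
    """Return the k words closest to vec(a) - vec(b) + vec(c), excluding a, b, c."""
    target = vectors[a] - vectors[b] + vectors[c]
    target = target / np.linalg.norm(target)
    scored = []
    for w, v in vectors.items():
        if w in (a, b, c):
            continue
        scored.append((w, float(v @ target / np.linalg.norm(v))))
    return sorted(scored, key=lambda x: -x[1])[:k]

# e.g. analogy("king", "man", "woman", vectors) would ideally return [("queen", ...)]
```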
-main assumptions of a bilingual embedding space
-"cat" in the corresponding embedding space
-...is it the same point in an interlingual space?
-...good question, have to think about it
-§as a curiosity, when you remove the need for a parallel corpus, have you tried a trilingual model?
-=>one paper reported enrichment with another model, combined with other languages
-...expect it as higher-quality supervision
-...unsure about the context, but would expect so
-§has anyone ever put this into a generative model?
-...see what it spits out in Spanish
-=>talking about combining the two,
-work on a Bayesian approach
-topic models, embeddings,
-!unaware of {...}
-are there popular published results for type-4 models?
-=>I think there are plenty of papers using this type of methodology in their work
-e.g.
a relational vector logic model predicated on essence
-all similar words are family relations; the essence of a word