- Rekimoto Lab / the University of Tokyo
- Bunkyo, Tokyo, Japan
- https://nawta.github.io/
- https://orcid.org/0000-0001-9966-4664
- @Cvadogsan
- in/naotonishida
- https://nawta.hatenadiary.com/
- nawta.ig
Highlights
- Pro
Stars
Multilingual Automatic Speech Recognition with word-level timestamps and confidence
Perform real-time transcription using faster-whisper.
Fast inference engine for Transformer models
Faster Whisper transcription with CTranslate2
Real time transcription with OpenAI Whisper.
Transcription, forced alignment, and audio indexing with OpenAI's Whisper
ROS2 node for DJI Tello and Visual SLAM for mapping of indoor environments.
Simple Online Realtime Tracking with a Deep Association Metric
Using this repository, you can identify people in a video, store their data in a SQLite database, and re-identify them whenever they appear in subsequent videos
Experimental projects we've done with DepthAI.
Extensive acceptance rates and information of main AI conferences
🚀 Catalyst is a C# Natural Language Processing library built for speed. Inspired by spaCy's design, it brings pre-trained models, out-of-the-box support for training word and document embeddings, a…
SpacyDotNet is a .NET wrapper for the popular natural language library spaCy
The Unreliability of Explanations in Few-shot Prompting for Textual Reasoning (NeurIPS 2022)
🐍 mecab-python. You can find the original version here: https://taku910.github.io/mecab/
Neologism dictionary based on the language resources on the Web for mecab-ipadic
Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities
Automatic tiling window manager for macOS à la xmonad.
A tiling window manager for macOS based on binary space partitioning
Reference implementation for DPO (Direct Preference Optimization)
Official implementation for "iTransformer: Inverted Transformers Are Effective for Time Series Forecasting" (ICLR 2024 Spotlight), https://openreview.net/forum?id=JePfAI8fah
Unofficial implementation of iTransformer - SOTA Time Series Forecasting using Attention networks, out of Tsinghua / Ant group
A repository for content submitted to Qiita.