Sitemap
About me
Future Post
Published: 2199-01-01
Tags: cool posts, category1, category2
This post will show up by default. To disable scheduling of future posts, edit config.yml and set future: false.
Blog Post 4
Published: 2015-08-14
Tags: cool posts, category1, category2
This is a sample blog post. Lorem ipsum I can’t remember the rest of lorem ipsum and don’t have an internet connection right now. Testing testing testing this blog post. Blog posts are cool.
Headings are cool
You can have many headings
Aren’t headings cool?
Blog Post 3
Published: 2014-08-14
Tags: cool posts, category1, category2
This is a sample blog post. Lorem ipsum I can’t remember the rest of lorem ipsum and don’t have an internet connection right now. Testing testing testing this blog post. Blog posts are cool.
Headings are cool
You can have many headings
Aren’t headings cool?
Blog Post number 2
Published:
This is a sample blog post. Lorem ipsum I can’t remember the rest of lorem ipsum and don’t have an internet connection right now. Testing testing testing this blog post. Blog posts are cool.
Blog Post number 1
Published:
This is a sample blog post. Lorem ipsum I can’t remember the rest of lorem ipsum and don’t have an internet connection right now. Testing testing testing this blog post. Blog posts are cool.
A Machine Learning Model Blind-Spot Detection System and Method
Maor Ivgi and Yuval Dafna. Submission: U.S. Provisional Patent Application No. 63/170,517.
Efficiently detecting blind-spots for black-box machine learning models.
Scene Graph to Image Generation with Contextualized Object Layout Refinement
Maor Ivgi, Yaniv Benny, Avichai Ben-David, Jonathan Berant, and Lior Wolf. Published in ICIP 2021
A novel approach to gradually generate realistic image layouts from scene-graphs by attending to all objects in the generated layout simultaneously.
Achieving Model Robustness through Discrete Adversarial Training
Maor Ivgi and Jonathan Berant. Published in EMNLP 2021
While many works have established that modern transformer-based NLP models are not robust, this work is about winning that lost robustness back.
Beyond Importance Scores: Interpreting Tabular ML by Visualizing Feature Semantics
Amirata Ghorbani, Dina Berenbaum, Maor Ivgi, Yuval Dafna, and James Zou. Published in MDPI (vol. 13), 2021
A novel way to visualize not only the importance of each feature in tabular data, but also the semantic meaning and relationships of features.
SCROLLS: Standardized CompaRison Over Long Language Sequences
Uri Shaham, Elad Segal, Maor Ivgi, Avia Efrat, Ori Yoran, Adi Haviv, Ankit Gupta, Wenhan Xiong, Mor Geva, Jonathan Berant, Omer Levy. Published in EMNLP 2022
SCROLLS is a suite of datasets that require synthesizing information over long texts. The benchmark includes seven natural language tasks across multiple domains, including summarization, question answering, and natural language inference.
Scaling Laws Under the Microscope: Predicting Transformer Performance from Small Scale Experiments
Maor Ivgi, Yair Carmon, Jonathan Berant. Published in Findings of EMNLP 2022
Scaling laws are undoubtedly fascinating, but can they be harnessed for efficient model design? In this work, we explore their usefulness across a variety of language understanding tasks, and show that in some cases, they can!
Efficient Long-Text Understanding with Short-Text Models
Maor Ivgi, Uri Shaham, Jonathan Berant. Published in TACL 2023, presented at ACL 2023
Can short-range LMs perform long-range reasoning? They can!
In this work, we propose the SLiding-Encoder and Decoder (SLED), which leverages existing battle-tested encoder-decoder LMs to operate over long-range NLU tasks.
DoG is SGD’s Best Friend: A Parameter-Free Dynamic Step Size Schedule
Maor Ivgi, Oliver Hinder, Yair Carmon. Published in ICML 2023
DoG is a tuning-free dynamic SGD step-size formula, backed by strong theoretical guarantees and empirically shown across many domains and model architectures to match well-tuned SGD with a best-practice learning-rate schedule.
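As a rough illustration of the distance-over-gradients idea behind DoG (a minimal sketch, not the paper's reference implementation): the step size at step t is the largest distance traveled from the initial point, divided by the square root of the accumulated squared gradient norms, seeded by a small epsilon. The function name dog_sgd and the toy quadratic below are invented for this example.

```python
import numpy as np

def dog_sgd(grad, x0, steps=1000, r_eps=1e-4):
    """Gradient descent with a DoG-style step size:
    eta_t = max_{s<=t} ||x_s - x0|| / sqrt(sum_{s<=t} ||g_s||^2),
    seeded with a small initial value r_eps so the first step is nonzero."""
    x0 = np.asarray(x0, dtype=float)
    x = x0.copy()
    max_dist = r_eps        # running maximum distance from the initial point
    grad_sq_sum = 0.0       # running sum of squared gradient norms
    for _ in range(steps):
        g = grad(x)
        grad_sq_sum += float(np.dot(g, g))
        eta = max_dist / np.sqrt(grad_sq_sum)
        x = x - eta * g
        max_dist = max(max_dist, float(np.linalg.norm(x - x0)))
    return x

# Toy usage: minimize f(x) = ||x - 3||^2 without tuning a learning rate.
x_star = dog_sgd(lambda x: 2 * (x - 3.0), x0=np.zeros(2))
```

Despite starting from a tiny epsilon step, the step size grows automatically as the iterates move away from the initial point, so no learning-rate tuning is needed.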
ZeroSCROLLS: A Zero-Shot Benchmark for Long Text Understanding
Uri Shaham, Maor Ivgi, Avia Efrat, Jonathan Berant, Omer Levy. Published in Findings of EMNLP 2023
ZeroSCROLLS is a suite of datasets that require synthesizing information over long texts. The benchmark includes ten natural language tasks across multiple domains, including summarization, question answering, aggregated sentiment classification and information reordering.
Accelerated Parameter-Free Stochastic Optimization
Itai Kreisler, Maor Ivgi, Oliver Hinder, Yair Carmon. arXiv preprint
Building on the DoG optimizer, ADoG is a tuning-free dynamic accelerated SGD step-size formula, backed by strong theoretical guarantees and empirically demonstrated to work well in convex settings.
Natural Language Processing, fall 2022
Undergraduate course, Tel-Aviv University, School of CS
Start date: 2023-05-01
Advanced undergraduate course in Natural Language Processing, covering topics from classic unigram methods to the most advanced large pretrained language models.
Syllabus
Natural Language Processing (NLP) aims to develop methods for processing, analyzing, and understanding natural language. The goal of this class is to provide a thorough overview of modern methods in the field. The class does not assume prior knowledge in NLP. Among other topics, we will cover word representations, language modeling, sequence models, and self-supervision, focusing on the interface between structured prediction and deep learning. Course website
Natural Language Processing, fall 2023
Undergraduate course, Tel-Aviv University, School of CS
Start date: 2024-01-01
Advanced undergraduate course in Natural Language Processing, covering topics from classic unigram methods to the most advanced large pretrained language models, instruction finetuning, and alignment.
Syllabus
Natural Language Processing (NLP) aims to develop methods for processing, analyzing, and understanding natural language. The goal of this class is to provide a thorough overview of modern methods in the field. The class does not assume prior knowledge in NLP. Among other topics, we will cover word representations, language modeling, sequence models, and self-supervision, focusing on the interface between structured prediction and deep learning. Course website
Natural Language Processing, spring 2023
Undergraduate course, Tel-Aviv University, School of CS
Start date: 2024-05-01
Advanced undergraduate course in Natural Language Processing, covering topics from classic unigram methods to the most advanced large pretrained language models, instruction finetuning, and alignment. In this class, students learn to read papers from the frontier of the field and implement them in practice, as well as develop their own research projects in NLP.
Syllabus
Natural Language Processing (NLP) aims to develop methods for processing, analyzing, and understanding natural language. The goal of this class is to provide a thorough overview of modern methods in the field. The class does not assume prior knowledge in NLP. Among other topics, we will cover word representations, language modeling, sequence models, and self-supervision, focusing on the interface between structured prediction and deep learning. Course website