Sitemap

Future Post

less than 1 minute read

Published: 2199-01-01

This post will show up by default. To disable scheduling of future posts, edit _config.yml and set future: false.

Blog Post number 4

less than 1 minute read

Published: 2015-08-14

This is a sample blog post. Lorem ipsum I can’t remember the rest of lorem ipsum and don’t have an internet connection right now. Testing testing testing this blog post. Blog posts are cool.

Blog Post number 3

less than 1 minute read

Published: 2014-08-14

This is a sample blog post. Lorem ipsum I can’t remember the rest of lorem ipsum and don’t have an internet connection right now. Testing testing testing this blog post. Blog posts are cool.

Blog Post number 2

less than 1 minute read

Published:

This is a sample blog post. Lorem ipsum I can’t remember the rest of lorem ipsum and don’t have an internet connection right now. Testing testing testing this blog post. Blog posts are cool.

Blog Post number 1

less than 1 minute read

Published:

This is a sample blog post. Lorem ipsum I can’t remember the rest of lorem ipsum and don’t have an internet connection right now. Testing testing testing this blog post. Blog posts are cool.

SCROLLS: Standardized CompaRison Over Long Language Sequences

Uri Shaham, Elad Segal, Maor Ivgi, Avia Efrat, Ori Yoran, Adi Haviv, Ankit Gupta, Wenhan Xiong, Mor Geva, Jonathan Berant, Omer Levy. Published in EMNLP 2022

SCROLLS is a suite of datasets that require synthesizing information over long texts. The benchmark includes seven natural language tasks across multiple domains, including summarization, question answering, and natural language inference.

Efficient Long-Text Understanding with Short-Text Models

Maor Ivgi, Uri Shaham, Jonathan Berant. Published in TACL 2023, to be presented at ACL 2023

Can short-range LMs perform long-range reasoning? They can!
In this work, we propose the SLiding-Encoder and Decoder (SLED), which leverages existing, battle-proven short-text encoder-decoder LMs to operate over long-range NLU tasks.
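
The core mechanism is simple enough to sketch. Below is a hedged, minimal illustration of the sliding-window idea (the function name, chunk size, and context size are illustrative choices, not the paper's exact parameters): the long input is split into overlapping windows so that a short-context encoder can encode each window, and the decoder then works over the combined representations.

```python
def sliding_chunks(token_ids, chunk_len=256, context_len=64):
    """Split a long token sequence into overlapping windows.

    Each window holds a core of fresh tokens plus `context_len` tokens of
    left/right context, so a short-context encoder sees some surrounding
    text for every position. Sizes here are illustrative only.
    """
    core = chunk_len - 2 * context_len
    chunks = []
    for start in range(0, len(token_ids), core):
        lo = max(0, start - context_len)
        hi = min(len(token_ids), start + core + context_len)
        chunks.append(token_ids[lo:hi])
    return chunks


# Toy usage: a 1,000-token "document" becomes 8 overlapping windows
# of up to 256 tokens each.
windows = sliding_chunks(list(range(1000)))
```

In SLED itself, roughly speaking, each window is encoded independently (with the task prefix, e.g. the question, prepended), and the decoder attends over the concatenated encoder outputs; the sketch above only shows the chunking step.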

DoG is SGD’s Best Friend: A Parameter-Free Dynamic Step Size Schedule

Maor Ivgi, Oliver Hinder, Yair Carmon. Published in ICML 2023

DoG is a tuning-free dynamic SGD step-size formula, backed by strong theoretical guarantees and empirically demonstrated across many domains and model architectures to match well-tuned SGD with a best-practice learning-rate schedule.
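
For intuition, here is a hedged, minimal NumPy sketch of a DoG-style step size (a paraphrase of the rule, not the authors' implementation; the function name, the r_eps seed, and the toy problem are illustrative): at each step, the learning rate is the largest distance from the initial point observed so far, divided by the square root of the running sum of squared gradient norms.

```python
import numpy as np


def dog_sgd(grad_fn, x0, steps=1000, r_eps=1e-4):
    """DoG-style parameter-free SGD sketch.

    eta_t = (max distance from x0 seen so far) / sqrt(sum of ||g_i||^2),
    with r_eps seeding the distance before the iterate has moved.
    """
    x = np.asarray(x0, dtype=float).copy()
    x_init = x.copy()
    max_dist = r_eps      # largest ||x_i - x_0|| observed so far
    grad_sq_sum = 0.0     # running sum of squared gradient norms
    for _ in range(steps):
        g = grad_fn(x)
        grad_sq_sum += float(np.dot(g, g))
        eta = max_dist / np.sqrt(grad_sq_sum + 1e-12)
        x = x - eta * g
        max_dist = max(max_dist, float(np.linalg.norm(x - x_init)))
    return x


# Toy usage: minimize f(x) = 0.5 * ||x - 3||^2, whose gradient is x - 3.
x_opt = dog_sgd(lambda x: x - 3.0, x0=np.zeros(2))
```

The point of the formula is that no learning rate needs to be hand-tuned: the numerator grows as the iterates travel away from the starting point, while the denominator normalizes by accumulated gradient magnitude, much like an AdaGrad-style scaling.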

ZeroSCROLLS: A Zero-Shot Benchmark for Long Text Understanding

Uri Shaham, Maor Ivgi, Avia Efrat, Jonathan Berant, Omer Levy. Published in Findings of EMNLP 2023

ZeroSCROLLS is a suite of datasets that require synthesizing information over long texts. The benchmark includes ten natural language tasks across multiple domains, including summarization, question answering, aggregated sentiment classification, and information reordering.

Accelerated Parameter-Free Stochastic Optimization

Itai Kreisler, Maor Ivgi, Oliver Hinder, Yair Carmon. arXiv preprint

Building on the DoG optimizer, ADoG is a tuning-free, accelerated dynamic SGD step-size formula, backed by strong theoretical guarantees and empirically demonstrated to work well in convex settings.

Natural Language Processing, fall 2022

Undergraduate course, Tel-Aviv University, School of CS
Start date: 2023-05-01

Advanced undergraduate course in Natural Language Processing, covering topics from classic unigram methods to the most advanced large pretrained language models.

Syllabus

Natural Language Processing (NLP) aims to develop methods for processing, analyzing and understanding natural language. The goal of this class is to provide a thorough overview of modern methods in the field of Natural Language Processing. The class will not assume prior knowledge in NLP. Among others, we will cover word representations, language modeling, sequence models, and self-supervision, focusing on the interface between structured prediction and deep learning. Course website

Natural Language Processing, fall 2023

Undergraduate course, Tel-Aviv University, School of CS
Start date: 2024-01-01

Advanced undergraduate course in Natural Language Processing, covering topics from classic unigram methods to the most advanced large pretrained language models, instruction finetuning, and alignment.

Syllabus

Natural Language Processing (NLP) aims to develop methods for processing, analyzing and understanding natural language. The goal of this class is to provide a thorough overview of modern methods in the field of Natural Language Processing. The class will not assume prior knowledge in NLP. Among others, we will cover word representations, language modeling, sequence models, and self-supervision, focusing on the interface between structured prediction and deep learning. Course website

Natural Language Processing, spring 2023

Undergraduate course, Tel-Aviv University, School of CS
Start date: 2024-05-01

Advanced undergraduate course in Natural Language Processing, covering topics from classic unigram methods to the most advanced large pretrained language models, instruction finetuning, and alignment. In this class, students learn to read papers from the frontier of the field and implement them in practice, as well as develop their own research projects in NLP.

Syllabus

Natural Language Processing (NLP) aims to develop methods for processing, analyzing and understanding natural language. The goal of this class is to provide a thorough overview of modern methods in the field of Natural Language Processing. The class will not assume prior knowledge in NLP. Among others, we will cover word representations, language modeling, sequence models, and self-supervision, focusing on the interface between structured prediction and deep learning. Course website
