In addition to my attendance at ICDM25, I was honored to attend the Conference on Neural Information Processing Systems (NeurIPS) this month to present OPTiCAL a second time. As with ICDM, I will be making blog posts about the academic highlights of the conference and my time in San Diego.
Posts about the following highlights of my trip will appear on this website.
A travel blog about my time in San Diego, including the places I visited, the food I ate, and my journey to and from the conference.
The papers, workshops, and tutorials that I found most inspiring and impressive will get their own blog post.
The UF Data Studio's contribution to NeurIPS was a second presentation of OPTiCAL: An Abstract Positional Reasoning Benchmark for Vision Language Models. A post summarizing the paper will appear soon.
I attended NeurIPS to present OPTiCAL: An Abstract Positional Reasoning Benchmark for Vision Language Models at the Evaluating the Evolving LLM Lifecycle workshop. I summarized the findings of that work in the overview of my ICDM25 posts, but I will reproduce the summary here.
The official paper should be available in the ICDM25 workshop proceedings, which will be released soon on IEEE Xplore. Our code and the Shapes30k dataset are available on GitHub. Feel free to reach out to the authors with questions about the work.
We received invaluable feedback on the presentation of OPTiCAL from the community of multimodal AI researchers at MMAI and from other colleagues at ICDM25. If you're working on multimodal AI, especially VLM evaluation, we'd love to hear from you. Please contact me through the channels available on our webpage.
For more information about our research, return to our homepage: ufdatastudio.com.