Search Results for author: Yilin Ye

Found 5 papers, 3 papers with code

Generative AI for Visualization: State of the Art and Future Directions

no code implementations · 28 Apr 2024 · Yilin Ye, Jianing Hao, Yihan Hou, Zhan Wang, Shishi Xiao, Yuyu Luo, Wei Zeng

From a technical perspective, this paper looks back on previous visualization studies leveraging GenAI and discusses the challenges and opportunities for future research.

Tasks: Graph Generation · Language Modelling · +1

The Contemporary Art of Image Search: Iterative User Intent Expansion via Vision-Language Model

no code implementations · 4 Dec 2023 · Yilin Ye, Qian Zhu, Shishi Xiao, Kang Zhang, Wei Zeng

Moreover, the intent expansion framework lets users perform flexible, contextualized interactions with the search results to iteratively specify or adjust their detailed search intents.

Tasks: Image Retrieval · Interactive Segmentation · +2

TimeTuner: Diagnosing Time Representations for Time-Series Forecasting with Counterfactual Explanations

1 code implementation · 19 Jul 2023 · Jianing Hao, Qing Shi, Yilin Ye, Wei Zeng

Deep learning (DL) approaches are being increasingly used for time-series forecasting, with many efforts devoted to designing complex DL models.

Tasks: counterfactual · Feature Engineering · +4

Let the Chart Spark: Embedding Semantic Context into Charts with Text-to-Image Generative Models

1 code implementation · 28 Apr 2023 · Shishi Xiao, Suizi Huang, Yue Lin, Yilin Ye, Wei Zeng

Pictorial visualization seamlessly integrates data and semantic context into visual representation, conveying complex information in a manner that is both engaging and informative.

Tasks: text-guided-generation

Everyone Can Be Picasso? A Computational Framework into the Myth of Human versus AI Painting

1 code implementation · 17 Apr 2023 · Yilin Ye, Rong Huang, Kang Zhang, Wei Zeng

Recent advances in AI technology, particularly in AI-Generated Content (AIGC), have enabled everyone to easily generate beautiful paintings from a simple text description.