Researcher Archives: Frames Catherine White

AbstractDifferentiation.jl: Backend-Agnostic Differentiable Programming in Julia

(Best Poster Award) No single Automatic Differentiation (AD) system is the optimal choice for all problems, so the informed selection of an AD system, or a combination of systems, is a problem-specific decision that can greatly impact performance. In the Julia programming language, the major AD systems target the same input and thus in theory can […]
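For illustration, a minimal sketch of what backend-agnostic differentiation looks like with AbstractDifferentiation.jl, assuming the `AD.gradient` and backend-constructor interface shown in the package README (names may vary between package versions):

```julia
# Sketch only: assumes AbstractDifferentiation.jl's AD.gradient / backend API
# as documented in its README; names may differ across versions.
import AbstractDifferentiation as AD
using ForwardDiff, Zygote   # the underlying AD packages the backends wrap

f(x) = sum(abs2, x)         # some differentiable program
x = [1.0, 2.0, 3.0]

# The same call works for any backend; only the backend object changes.
for backend in (AD.ForwardDiffBackend(), AD.ZygoteBackend())
    (g,) = AD.gradient(backend, f, x)   # returns a tuple, one gradient per argument
    @show backend g
end
```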

WEmbSim: A Simple yet Effective Metric for Image Captioning

(DSTG Best Contribution to Science Award) Automatic image caption evaluation remains an area of intensive research, driven by the need for captions that meet adequacy and fluency requirements. Based on our past attempts at developing highly sophisticated learning-based metrics, we have discovered that a simple cosine similarity measure using the […]
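As a rough illustration of the kind of metric described here, a cosine similarity over caption embeddings could be sketched as below. The `embed` lookup and the mean-of-word-embeddings caption representation are assumptions made for this illustration, not the paper's exact formulation:

```julia
using LinearAlgebra

# Sketch only: `embed` is a hypothetical lookup from a token to a pretrained
# word-embedding vector; representing a caption by the mean of its word
# embeddings is an assumption for illustration.
caption_embedding(tokens, embed) = sum(embed(t) for t in tokens) ./ length(tokens)

cosine(u, v) = dot(u, v) / (norm(u) * norm(v))

# Score a candidate caption against a reference caption; result lies in [-1, 1].
function caption_score(candidate_tokens, reference_tokens, embed)
    cosine(caption_embedding(candidate_tokens, embed),
           caption_embedding(reference_tokens, embed))
end
```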

WordTokenizers.jl: Basic tools for tokenizing natural language in Julia

WordTokenizers.jl is a tool to help users of the Julia programming language (Bezanson, Edelman, Karpinski, & Shah, 2014) work with natural language. In natural language processing (NLP), tokenization refers to breaking a text up into parts – the tokens. Generally, this means breaking a sentence up into words and other tokens such as punctuation. […]
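For example, two of the package's exported functions in action (a brief sketch based on the package README; the exact tokens produced can vary with the configured tokenizer and package version):

```julia
using WordTokenizers

# Split raw text into sentences, then each sentence into word/punctuation tokens.
text = "WordTokenizers.jl is simple. It breaks text into tokens!"
for sentence in split_sentences(text)
    @show tokenize(sentence)
end
```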

Meta-Optimization of Optimal Power Flow

The planning and operation of electricity grids is carried out by solving various forms of constrained optimization problems. With the increasing variability of system conditions due to the integration of renewable and other distributed energy resources, such optimization problems are growing in complexity and need to be repeated daily, often limited to a 5-minute solve time. To address this, we propose a meta-optimizer that is used to initialize interior-point solvers. This can significantly reduce the number of iterations required to converge to optimality.
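A toy illustration of the warm-start principle only, not the paper's method: the solver below is plain gradient descent on a made-up quadratic rather than an interior-point OPF solver, and the "predicted" starting point is simulated instead of coming from a trained meta-optimizer.

```julia
using LinearAlgebra

# Stand-in iterative solver: fixed-step gradient descent, reporting how many
# iterations it needs to drive the gradient norm below `tol`.
function iterations_to_converge(grad, x0; step = 0.1, tol = 1e-6, maxiter = 100_000)
    x = copy(x0)
    for k in 1:maxiter
        g = grad(x)
        norm(g) < tol && return k
        x -= step .* g
    end
    return maxiter
end

A = [4.0 1.0; 1.0 3.0]      # toy strongly convex objective ½xᵀAx − bᵀx
b = [1.0, 2.0]
grad(x) = A * x - b
x_opt = A \ b

cold = iterations_to_converge(grad, zeros(2))                   # flat start
# Pretend a trained meta-optimizer predicts a point near the optimum:
warm = iterations_to_converge(grad, x_opt .+ 1e-5 .* randn(2))  # learned start
# `warm` comes out noticeably smaller than `cold`, which is the effect the
# meta-optimizer aims for on real interior-point OPF solves.
```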