Local computation and reducibility
Technical Report, Computer Science Division, UC Berkeley, May 17, 2006
In my doctoral dissertation I studied sampling-based algorithms and established fundamental limitations on their power. The work also developed a notion of reducibility among algorithms that can only “look” at small pieces of their input, and related this class of algorithms to other areas of combinatorial optimization.
Approximate max-integral-flow/min-multicut theorems
ACM Symposium on Theory of Computing, June 13, 2004
I proved a fundamental theorem in network optimization bounding the gap between the maximum multicommodity flow and the minimum multicut of a graph when flows are required to be discrete (integral). Prior to this work, the approximation ratio was known only for the special case of trees. My methods showed that “greedy” routing along shortest paths produces approximately optimal flows.
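A toy sketch of the greedy idea (not the algorithm or analysis from the paper): route each commodity one unit at a time along the shortest path that still has residual capacity, decrementing capacities as edges are used. All graph and demand structures here are illustrative.

```python
from collections import deque

def shortest_path(adj, cap, s, t):
    """BFS shortest path from s to t using only edges with remaining capacity."""
    prev = {s: None}
    q = deque([s])
    while q:
        u = q.popleft()
        if u == t:
            path = []
            while u is not None:
                path.append(u)
                u = prev[u]
            return path[::-1]
        for v in adj[u]:
            if v not in prev and cap[(u, v)] > 0:
                prev[v] = u
                q.append(v)
    return None  # no augmenting path with residual capacity

def greedy_route(adj, cap, demands):
    """Greedily route one unit per (source, sink) demand along a shortest
    available path, consuming capacity; returns the demands actually routed."""
    routed = []
    for s, t in demands:
        p = shortest_path(adj, cap, s, t)
        if p is None:
            continue
        for u, v in zip(p, p[1:]):
            cap[(u, v)] -= 1
            cap[(v, u)] -= 1
        routed.append((s, t, p))
    return routed
```

On a path graph a–b–c with unit capacities, the demand (a, c) consumes the edge a–b, so a second demand (a, b) cannot be routed; the unserved demands are exactly where a multicut argument enters.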
A lower bound for testing 3-colorability in bounded-degree graphs (with L. Trevisan and A. Bogdanov)
IEEE Symposium on Foundations of Computer Science, November 19, 2002
My coauthors and I proved that 3CNF satisfiability and 3-colorability cannot be tested by sampling-based algorithms that make a sublinear number of queries. I developed an explicit combinatorial construction of a family of bounded-degree graphs in which every large subgraph is 3-colorable but the graph as a whole is far from 3-colorable. This established the first linear lower bound for a natural problem in this model.
Optimal lower bounds for 2-query locally decodable linear codes
Randomization and Approximation Techniques in Computer Science, September 13, 2002
I established an asymptotically tight lower bound on the length of any linear code that can be decoded by a randomized procedure reading only two bits of the codeword. The problem was motivated by techniques in private information retrieval and probabilistically checkable proofs.
Method of hosting a first application in a second application
United States 8,763,009 B2, April 15, 2011
Discloses methods for hosting and coordinating the execution of virtualized runtime environments within a separate host application. As an example, Spoon uses these methods to execute virtual containers inside a web browser.
Method and system for building a streaming model
United States 8,468,175 B2, September 10, 2010
This is one of a series of patents and patents pending on the use of stochastic techniques to automatically decompose virtual machines into components that can be dynamically selected for transfer based on a model of user behavior.
Method and system for prediction of software data consumption patterns
United States 8,769,051 B2, September 10, 2010
Along with 8,468,175 B2, describes the predictive application streaming and execution algorithms used by the Spoon system to dynamically transfer portions of a virtual machine image based on an adaptive statistical model of an application’s data consumption patterns.
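The patented algorithms are not reproduced here; as a rough illustration of the underlying idea, a predictive streamer can learn which block of a virtual machine image tends to be accessed after which, then prefetch the most likely successor. The first-order Markov model below is a hypothetical stand-in for the adaptive statistical model described in the patents.

```python
from collections import defaultdict

class BlockAccessPredictor:
    """Illustrative first-order Markov model over block accesses: counts
    observed transitions between block IDs and predicts the most likely
    next block to prefetch. Not the patented algorithm."""

    def __init__(self):
        self.counts = defaultdict(lambda: defaultdict(int))
        self.last = None

    def observe(self, block):
        """Record an access, updating the transition count from the previous block."""
        if self.last is not None:
            self.counts[self.last][block] += 1
        self.last = block

    def predict_next(self, block):
        """Return the most frequently observed successor of `block`, or None."""
        successors = self.counts.get(block)
        if not successors:
            return None
        return max(successors, key=successors.get)
```

Feeding the access trace 1, 2, 3, 1, 2, 4, 1, 2, 3 into the model, block 1 is always followed by 2, and block 2 is followed by 3 more often than by 4, so those become the prefetch candidates.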
Method and system for building and distributing application profiles via the Internet
United States 8,762,495 B2, September 8, 2010
Describes methods for automatically computing and gathering application execution transcripts via the Internet. Transcripts are used as training data for the machine learning algorithms disclosed in other patents.
Method and system for managing execution of virtual applications
United States 8,782,106 B2, September 2, 2010
This patent describes the management of virtual processes by a runtime engine (called the “sandbox manager” in the Spoon system), particularly coordinating download, caching, and execution of virtual machine container images.
Method and system for virtualization of software applications
United States 8,434,093 B2, August 7, 2008
Describes techniques for automatically virtualizing applications using “templates” that encode configuration and environment information, possibly incorporating OS-dependent logic. These methods address a key obstacle in application virtualization: the difficulty of creating virtual application packages.
Adaptive web crawling using a statistical model
United States 7,328,401 B2, December 22, 2004
By using a Poisson-like process to model web content updates, we developed an optimal randomized strategy for selecting content to be queried for updates during web crawls.
Proxy server using a statistical model
United States 7,603,616 B2, November 5, 2004
We observed that our adaptive web crawling method can also reduce network resource consumption by proxy servers. This patent describes a variant of US 7,328,401 B2 applied to content-access proxies.
Differential cyclic redundancy check
United States 6,601,216 B1, March 31, 2000
I developed a method to increase the efficiency of computing error control information in databases during in-place page updates.
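The patent's method is not reproduced here, but the general principle it exploits can be sketched: a CRC with zero initial value and no final XOR is linear over GF(2), so crc(old ⊕ delta) = crc(old) ⊕ crc(delta). The CRC of an updated page can therefore be derived from the old CRC and the byte-wise difference, without rereading the unchanged data from scratch.

```python
def crc32_raw(data: bytes) -> int:
    """Bitwise CRC-32 (reflected poly 0xEDB88320, init 0, no final XOR).
    With these parameters the CRC is a linear map over GF(2), so
    crc(a ^ b) == crc(a) ^ crc(b) for equal-length inputs."""
    crc = 0
    for byte in data:
        crc ^= byte
        for _ in range(8):
            crc = (crc >> 1) ^ (0xEDB88320 if crc & 1 else 0)
    return crc

def updated_crc(old_crc: int, old_page: bytes, new_page: bytes) -> int:
    """Derive the new page's CRC from the old CRC plus the CRC of the XOR
    delta. This naive sketch still walks the whole delta; an efficient
    variant would skip runs of zero (unchanged) bytes using a precomputed
    linear operator for advancing the CRC."""
    delta = bytes(a ^ b for a, b in zip(old_page, new_page))
    return old_crc ^ crc32_raw(delta)
```

For an in-place page update that touches a few bytes, the delta is zero almost everywhere, which is what makes a differential computation cheaper than recomputing the CRC over the full page.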