Smart Machine Actively Regularized Tensor Learning in Infinite Family of Feature Space.
We believe there always exists a much smarter solution to any given data science problem within the infinite feature space. Hence, we keep searching for hidden (latent) patterns in the family of infinite feature spaces using efficient computational approaches, together with machine active learning and actively regularized tensor learning.
We define many “if and only if” conditions (IFFs) and analyze the intrinsic “necessary and sufficient” conditions of the (I)nfinite (F)amily of (F)eature space and of machine learning (ML) models for solving modern problems on any data, big or small. We also study the asymptotic behavior of the IFF space to derive models that remain smart and secure while handling data heterogeneity and scalability problems.
Our goal is to adapt the concept of a family of infinite feature spaces for discovering transformative knowledge from any (big or small) data and for developing secure and smart machine learning techniques (machine learning as a master key) for interdisciplinary applications.
Our solution is the use of advanced high-dimensional computational techniques in the infinite feature space. It includes the study and use of infinite Gaussian mixture models (with the Dirichlet distribution and the Dirichlet process), Bayesian mixture models (with MCMC via Gibbs sampling and Metropolis-Hastings sampling), Riemannian geometry (with an analysis of divergence characteristics using the Hessian and the Laplacian), numerical optimization techniques (expectation maximization, along with Nelder-Mead, BFGS, and trust-region methods), uncertainty quantification (Bayesian and Karhunen-Loève expansions and approximations), and distribution measures (Rényi divergence and Kullback–Leibler divergence).
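As an illustration of the infinite-mixture ingredient above, the following minimal sketch (not the lab's implementation) approximates an infinite Gaussian mixture with a truncated Dirichlet-process prior using scikit-learn's variational BayesianGaussianMixture; the toy data and truncation level are assumptions made only for this example.

```python
# Minimal sketch: a truncated Dirichlet-process Gaussian mixture.
# Unused components receive near-zero weights, so the effective number of
# clusters is inferred from the data rather than fixed in advance.
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)
# Toy data drawn from three Gaussians; the model is told nothing about "three".
X = np.vstack([
    rng.normal(loc=-4.0, scale=0.6, size=(200, 2)),
    rng.normal(loc=0.0, scale=0.8, size=(200, 2)),
    rng.normal(loc=5.0, scale=0.5, size=(200, 2)),
])

dpgmm = BayesianGaussianMixture(
    n_components=15,                                   # truncation level, not the true K
    weight_concentration_prior_type="dirichlet_process",
    weight_concentration_prior=0.1,                    # small alpha favors fewer clusters
    covariance_type="full",
    max_iter=500,
    random_state=0,
).fit(X)

active = dpgmm.weights_ > 1e-2
print("effective number of components:", active.sum())
print("their mixture weights:", np.round(dpgmm.weights_[active], 3))
```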
It facilitates the efficient use of evidence and reasoning, as they become available, in defining and updating the uncertain (or latent) parameters of the infinite feature space and in analyzing their dynamic properties.
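Below is a minimal sketch of this sequential-updating idea, using a hypothetical Gaussian mean with a conjugate Normal prior (an illustration under assumed settings, not the lab's actual model).

```python
# Minimal sketch: update a latent parameter's posterior as new evidence arrives.
# Here the parameter is a Gaussian mean with known noise variance, so the
# conjugate Normal prior gives a closed-form update after each observation.
import numpy as np

def update_gaussian_mean(prior_mu, prior_var, obs, noise_var):
    """One conjugate update of N(mu, var) after observing obs ~ N(theta, noise_var)."""
    post_var = 1.0 / (1.0 / prior_var + 1.0 / noise_var)
    post_mu = post_var * (prior_mu / prior_var + obs / noise_var)
    return post_mu, post_var

rng = np.random.default_rng(1)
true_theta, noise_var = 2.5, 1.0
mu, var = 0.0, 10.0            # vague prior over the latent parameter

for t, y in enumerate(rng.normal(true_theta, np.sqrt(noise_var), size=20), start=1):
    mu, var = update_gaussian_mean(mu, var, y, noise_var)
    if t % 5 == 0:
        print(f"after {t:2d} observations: posterior mean {mu:.3f}, variance {var:.4f}")
```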
It promotes the understanding of the divergence characteristics of the gradient field and improves tensor learning at any state or instance. In other words, it is a major player in characterizing conformist and nonconformist behavior.
It helps us visualize the input data at infinity. In other words, with the help of stochastic processes, we can characterize the data at the asymptote as well as in transition. A multi-state Markov chain is the key contributor.
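The following minimal sketch illustrates the multi-state Markov-chain idea with an assumed toy transition matrix: the stationary distribution characterizes the data at the asymptote, while powers of the transition matrix describe it in transition.

```python
# Minimal sketch: asymptotic vs. transient behavior of a multi-state Markov chain.
import numpy as np

# Row-stochastic transition matrix over three hypothetical data states.
P = np.array([
    [0.80, 0.15, 0.05],
    [0.10, 0.70, 0.20],
    [0.05, 0.25, 0.70],
])

# Stationary distribution: left eigenvector of P for eigenvalue 1.
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1.0))])
pi = pi / pi.sum()
print("stationary (asymptotic) distribution:", np.round(pi, 4))

# Transient behavior: distribution after n steps from a point mass on state 0.
start = np.array([1.0, 0.0, 0.0])
for n in (1, 5, 50):
    print(f"after {n:2d} steps:", np.round(start @ np.linalg.matrix_power(P, n), 4))
```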
It treats the gradient-field representation of the data in Riemannian geometry as an image signal and enables the adaptation of image-analysis tools for improving data quality and developing efficient computational approaches.
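A minimal sketch of this idea, on an assumed toy scalar field rather than the lab's pipeline: the gradient field is treated as an image and cleaned with standard image-analysis filters.

```python
# Minimal sketch: treat a gradient field like an image and apply image tools
# (Gaussian smoothing, Sobel filtering) before further learning.
import numpy as np
from scipy import ndimage

# Toy noisy scalar field on a grid, standing in for some data representation.
x, y = np.meshgrid(np.linspace(-3, 3, 128), np.linspace(-3, 3, 128))
field = np.exp(-(x**2 + y**2)) + 0.05 * np.random.default_rng(2).normal(size=x.shape)

smoothed = ndimage.gaussian_filter(field, sigma=2.0)   # denoise as if it were an image
gx = ndimage.sobel(smoothed, axis=1)                   # gradient component in x
gy = ndimage.sobel(smoothed, axis=0)                   # gradient component in y
grad_mag = np.hypot(gx, gy)                            # gradient-magnitude "image"

print(f"gradient magnitude range: {grad_mag.min():.4f} to {grad_mag.max():.4f}")
```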
It helps us define infinite mixture models and infer a family of infinite feature spaces through evidence and reasoning, along with if-and-only-if conditioning of the infinite feature space and of machine learning techniques for tensor learning.
It sharpens the process of selecting a small group of machines (sensing algorithms) to promote tensor learning and make the group smarter. Parallel processing is used in this machine active learning.
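A minimal sketch of such a selection step, with a hypothetical candidate pool and cross-validation scoring standing in for the lab's sensing algorithms: candidates are scored in parallel and a small top-k committee is kept.

```python
# Minimal sketch: score a pool of candidate "machines" in parallel and keep
# the best few as a small committee. The candidates and scoring are assumed
# for illustration only.
from concurrent.futures import ProcessPoolExecutor
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=20, random_state=0)

CANDIDATES = {
    "logreg": LogisticRegression(max_iter=1000),
    "tree": DecisionTreeClassifier(random_state=0),
    "knn": KNeighborsClassifier(),
    "nb": GaussianNB(),
}

def score(name):
    """Cross-validated accuracy of one candidate machine."""
    return name, cross_val_score(CANDIDATES[name], X, y, cv=5).mean()

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:              # evaluate candidates in parallel
        results = dict(pool.map(score, CANDIDATES))
    committee = sorted(results, key=results.get, reverse=True)[:2]  # keep the 2 best
    print("scores:", {k: round(v, 3) for k, v in results.items()})
    print("selected committee:", committee)
```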
We call this the “BIRDS” technique, where B stands for Bayesian, I for Image, R for Riemannian, D for Dirichlet, and S for Stochastic. It integrates these techniques into a hybrid approach that is combined with machine active learning and the infinite feature space to solve challenging data science problems.
An approach that integrates perceptual parameters for domain division.
By assigning probabilities to a single point in the data domain over the infinite feature space.
Able to update uncertain parameters in the infinite feature space!
PhD in Computer Science from Monash University, Australia
Fellow of the Royal College of Ophthalmologists
PhD in Vision Science from University of California at Berkeley
I graduated in fall 2019! I modeled the asymptotic behavior of fMRI signals for the development of secure machine learning techniques in neuroscience.
I will study multimodal ophthalmic images and develop an efficient data fusion technique to help machine learning models detect and characterize retinal diseases.
I can do dimensionality reduction easily and find a subspace that can help improve machine learning algorithms. I am still validating the findings using fMRI data.
My goal is to study the infinite family of feature spaces and develop smart machine learning algorithms that identify and isolate overlapping features detected in multimodal ophthalmic images.
I search for latent features of retinal diseases within OCT volumetric images with the help of tensor calculus and Riemannian geometry for building smart machine learning models.
We work only on research problems that are cutting-edge, challenging, and carry high intellectual merit and broader impact.
We transform complex and heterogeneous big data into adaptable data!
We mold machine learning models and algorithms to become much smarter and more secure!
We discover hidden conformist and nonconformist patterns in the family of infinite feature spaces!
We work only on applications that challenge us! If you think your application poses a problem that will challenge us and you need a solution, contact us!
A multimodal ophthalmic image registration result obtained using the BIRDS technique.
We would like to take this opportunity to thank our current and past funding sources. With their support, we are determined to perform high-quality research with intellectual merit and to have a positive impact on the discipline and the wider community.
One of these funding sources supports the research project “Next Generation Optogenetics for Vision Restoration.”