
We are infinite feature space thinkers!!

Smart Machine Actively Regularized Tensor Learning in Infinite Family of Feature Space. 

SMARTLIFFS RESEARCH GROUP

We believe that, for any given data science problem, a much smarter solution always exists in the infinite feature space. Hence we keep searching for hidden (latent) patterns in the family of infinite feature space using efficient computational approaches together with machine active learning and actively regularized tensor learning.


Our Goal

We define many “if and only if” conditions (IFFs) and analyze the intrinsic “necessary and sufficient” conditions of the (I)nfinite (F)amily of (F)eature space and of machine learning (ML) models for solving modern data problems, big or small. We also study the asymptotic behavior of the IFF space to derive models that are smart and secure while handling data heterogeneity and scalability problems.

Our goal includes the adaptation of the concept of family of infinite feature space for discovering transformative knowledge from any (big or small) data and developing secure and smart machine learning techniques — machine learning as a master-key — for interdisciplinary applications.

OUR SOLUTION

Our solution is the use of advanced high-dimensional computational techniques in the infinite feature space. It includes the study and use of infinite Gaussian mixture models (with the Dirichlet distribution and the Dirichlet process), Bayesian mixture models (with MCMC – Gibbs sampling and Metropolis-Hastings sampling), Riemannian geometry (with an analysis of divergence characteristics using the Hessian and the Laplacian), numerical optimization techniques (with Expectation-Maximization as well as Nelder-Mead, BFGS, and trust-region methods), uncertainty quantification (Bayesian and Karhunen-Loève expansions and approximations), and distribution measures (Rényi divergence and Kullback–Leibler divergence).
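
As a concrete illustration of the distribution measures listed above, here is a minimal sketch (with arbitrary Gaussian parameters, not our actual implementation) that evaluates the closed-form Kullback–Leibler divergence between two univariate Gaussians and cross-checks it with a Monte Carlo estimate.

```python
import numpy as np

def kl_gaussian(mu1, sigma1, mu2, sigma2):
    """KL( N(mu1, sigma1^2) || N(mu2, sigma2^2) ) in nats, closed form."""
    return (np.log(sigma2 / sigma1)
            + (sigma1**2 + (mu1 - mu2)**2) / (2.0 * sigma2**2)
            - 0.5)

# Monte Carlo check: average of log p(x) - log q(x) under samples from p.
rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, 200_000)                    # samples from p = N(0, 1)
log_ratio = -0.5 * x**2 - (-0.5 * ((x - 1.0) / 2.0)**2 - np.log(2.0))
print(kl_gaussian(0.0, 1.0, 1.0, 2.0), log_ratio.mean())   # both approx. 0.443
```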

Bayesian Inference

It facilitates the efficient use of evidence and reasoning, as they become available, in defining and updating the uncertain (or latent) parameters of the infinite feature space and in analyzing their dynamic properties.
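
A minimal, self-contained sketch of this updating idea, using a conjugate Beta-Bernoulli model and a made-up stream of binary evidence rather than our actual models: the posterior over a latent success rate is refreshed each time a new observation becomes available.

```python
# Beta(a, b) prior over a latent success rate; conjugate update per observation.
a, b = 1.0, 1.0                    # uniform prior
evidence = [1, 0, 1, 1, 0, 1]      # hypothetical stream of binary outcomes

for outcome in evidence:
    a += outcome                   # one more observed success
    b += 1 - outcome               # one more observed failure
    mean = a / (a + b)             # current posterior mean of the latent rate
    print(f"posterior Beta({a:.0f}, {b:.0f}), mean = {mean:.3f}")
```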

Riemannian Geometry

It promotes the understanding of the divergence characteristics of the gradient field and improves tensor learning at any state or instance. In other words, it is a major player in conformist and nonconformist characterization.
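
This relationship can be made concrete with a small numerical sketch: the divergence of the gradient field of a scalar function equals its Laplacian, which is the trace of its Hessian. The function and the evaluation point below are arbitrary choices for illustration only.

```python
import numpy as np

def f(p):
    x, y = p
    return x**2 + 3.0 * y**2       # toy scalar field; its Laplacian is 2 + 6 = 8

def hessian(func, p, h=1e-4):
    """Central-difference Hessian of a scalar function at point p."""
    p = np.asarray(p, dtype=float)
    n = p.size
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            ei, ej = np.zeros(n), np.zeros(n)
            ei[i], ej[j] = h, h
            H[i, j] = (func(p + ei + ej) - func(p + ei - ej)
                       - func(p - ei + ej) + func(p - ei - ej)) / (4 * h**2)
    return H

H = hessian(f, [1.0, -2.0])
print(np.trace(H))                 # divergence of the gradient field, approx. 8
```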

Stochastic Processes

It helps us visualize the input data at infinity. In other words, with the help of stochastic processes, we can characterize the data at the asymptote while also characterizing it in transition. A multi-state Markov chain is the key contributor.
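
A minimal sketch of this asymptotic view, using a hypothetical three-state transition matrix: repeatedly pushing a state distribution through the chain drives it toward the stationary distribution, i.e. the characterization of the data at the asymptote.

```python
import numpy as np

P = np.array([[0.7, 0.2, 0.1],     # hypothetical transition probabilities
              [0.3, 0.5, 0.2],     # (rows sum to one)
              [0.2, 0.3, 0.5]])

pi = np.array([1.0, 0.0, 0.0])     # start entirely in state 0 ("in transition")
for _ in range(200):               # power iteration toward the asymptote
    pi = pi @ P

print(pi)                          # stationary distribution: pi = pi @ P
print(np.abs(pi @ P - pi).max())   # residual is ~0 at the asymptote
```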

Image Processing

It treats the gradient field representation of data in Riemannian geometry as image signals and enables the adaptation of image analysis tools for improving the quality of data and developing efficient computational approaches.
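
A minimal sketch of this adaptation, assuming the gradient field representation is available as a 2-D array (random placeholder data below): standard image-analysis operators from SciPy are applied to it as if it were an image signal.

```python
import numpy as np
from scipy import ndimage

field = np.random.default_rng(1).normal(size=(64, 64))    # placeholder "gradient field"

gx = ndimage.sobel(field, axis=0)                          # horizontal edge response
gy = ndimage.sobel(field, axis=1)                          # vertical edge response
magnitude = np.hypot(gx, gy)                               # edge-strength "image"
smoothed = ndimage.gaussian_filter(magnitude, sigma=1.5)   # denoised version
print(magnitude.shape, float(smoothed.mean()))
```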

Dirichlet Processes

It helps us define infinite mixture models and infer a family of infinite feature space through evidence and reasoning, along with if-and-only-if conditioning of the infinite feature space and machine learning techniques for tensor learning.
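
A minimal sketch of a Dirichlet-process mixture, using scikit-learn's truncated variational BayesianGaussianMixture on synthetic two-cluster data rather than our own inference machinery: mixture components that the data do not support receive negligible weight, which is how the infinite mixture is approximated in practice.

```python
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-3.0, 0.5, (200, 2)),     # synthetic two-cluster data
               rng.normal(+3.0, 0.5, (200, 2))])

dpgmm = BayesianGaussianMixture(
    n_components=10,                                 # truncation level of the DP
    weight_concentration_prior_type="dirichlet_process",
    random_state=0,
).fit(X)

print(np.round(dpgmm.weights_, 3))   # most of the 10 weights collapse toward 0
```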

Machine Action Learning

It sharpens the process of selecting a small group of machines (sensing algorithms) for promoting tensor learning and making the group smarter. The parallel processing concept is used in this machine action learning.
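
The sketch below is only one simplified, hypothetical reading of machine action learning, not our actual algorithm: a pool of candidate machines (ordinary scikit-learn classifiers on synthetic data) is scored in parallel with joblib, and the small top-scoring group is kept.

```python
from joblib import Parallel, delayed
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC

X, y = make_classification(n_samples=500, n_features=20, random_state=0)
pool = [LogisticRegression(max_iter=1000), SVC(), GaussianNB(),
        RandomForestClassifier(random_state=0)]

def score(model, X, y):
    return cross_val_score(model, X, y, cv=5).mean()   # held-out performance

# Score the pool of machines in parallel, then keep the best small group.
scores = Parallel(n_jobs=-1)(delayed(score)(m, X, y) for m in pool)
top_group = sorted(zip(scores, pool), key=lambda pair: -pair[0])[:2]
for s, m in top_group:
    print(f"{m.__class__.__name__}: {s:.3f}")
```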

We call this the “BIRDS” technique, where B stands for Bayesian, I for Image, R for Riemannian, D for Dirichlet, and S for Stochastic. It integrates these five approaches into a hybrid technique that is combined with machine active learning and the infinite feature space to solve challenging data science problems.

PERCEPTUALLY-INSPIRED DEEP LEARNING SOLUTION


What? Data Domain Division: an approach that integrates perceptual parameters for domain division.

How? Probabilistic Tensor Learning: by assigning probabilities to a single point in the data domain over the infinite feature space (see the sketch below).

Why? Updating Latent Parameters: the model is able to update the uncertain (latent) parameters in the infinite feature space!
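
A minimal, hypothetical sketch of the probability-assignment step above, using a small finite Gaussian mixture as a stand-in for the infinite feature space (all component parameters are made up): the responsibilities of a single point across the components form a probability vector that sums to one.

```python
import numpy as np
from scipy.stats import norm

# Hypothetical mixture standing in for a truncated infinite feature space.
means   = np.array([-2.0, 0.0, 2.0])
sigmas  = np.array([0.5, 1.0, 0.5])
weights = np.array([0.3, 0.4, 0.3])

x = 0.4                                          # a single point in the data domain
likelihoods = weights * norm.pdf(x, means, sigmas)
responsibilities = likelihoods / likelihoods.sum()
print(np.round(responsibilities, 3), responsibilities.sum())   # sums to 1.0
```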

OUR RESEARCH TEAM


Dr. Shan Suthaharan (University of North Carolina at Greensboro)

PhD in Computer Science from Monash University, Australia

Dr. Kunal Dansingani (University of Pittsburgh School of Medicine)

Fellow of the Royal College of Ophthalmologists

Dr. Ethan Rossi (University of Pittsburgh School of Medicine)

PhD in Vision Science from the University of California at Berkeley

OUR FUTURE LEADERS


Naseeb Thapaliya (Computer Science)

I graduated in fall 2019! I modeled the asymptotic behavior of fMRI signals for the development of secure machine learning techniques in neuroscience.

Nisha Saini (Computer Science)

I will study multimodal ophthalmic images and develop an efficient data fusion technique to help machine learning models detect and characterize retinal diseases.

Lavanya Goluguri (Computer Science)

I can do dimensionality reduction easily and find a subspace that can help improve machine learning algorithms. I am still validating the findings using fMRI data.

Ritu Joshi (Computer Science)

My goal is to study the infinite family of feature spaces and develop smart machine learning algorithms that identify and isolate overlapping features detected in multimodal ophthalmic images.

Ryan Soorya (Computer Science)

I search for latent features of retinal diseases within OCT volumetric images with the help of tensor calculus and Riemannian geometry for building smart machine learning models.

Aparna Muppalla (Computer Science)

RESEARCH PROBLEMS

We work only on research problems that are cutting-edge, challenging, and of high intellectual merit and broad impact.

Big Data (Infinite Feature Space)

We transform complex and heterogeneous big data into adaptable data!!

Machine Learning (Infinite Feature Space)

We mold machine learning models and algorithms to become much smarter and more secure!!

Data Science (Infinite Feature Space)

We discover hidden conformist and nonconformist patterns in the family of infinite feature space!!

CURRENT PROJECTS

We work only on applications that challenge us! If you have an application whose problem will challenge us and you need a solution, then contact us!

Computational Neural Science

Computational Vision Science

A multimodal ophthalmic image registration result using BIRDS techniques.

Computational Network Science

CURRENT FUNDING

We would like to take this opportunity to thank our current and past funding sources. With their support, we are determined to perform high-quality research with intellectual merit and to have a positive impact on the discipline and the wider community.

University of Pittsburgh Medical Center

Fondation Voir et Entendre – Institut de la Vision

It supports the research project “Next Generation Optogenetics for Vision Restoration.”
