
Dr. B. Khushalani
Consultant (Research and Education), India
Title: Regolith Modeling on the Polyhedron
Abstract:
The measured anomaly can be defined as the difference between the measured
gravity field of a polyhedron and the gravity field obtained by treating the
polyhedron as a constant-density model. Regolith modeling on the polyhedron
can be done by decomposing the polyhedron into two polyhedra at a chosen
truncation radius $r$ from the center. Using this approach, one can model a
regolith of any depth, say 100 meters, on the polyhedron and obtain the
resulting gravity signature. If ($\rho_{1},v_{1},C_{1}$) and
($\rho_{2},v_{2},C_{2}$) denote the sets (density, volume, gravity
coefficients) for the two polyhedra, the gravity coefficients of the combined
body can be obtained by constraining the total mass.
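As a minimal sketch of this mass constraint, assuming the two polyhedra share a common center and reference radius and that the coefficients $C$ are mass-normalized (normalization choices not stated in the abstract), the combined coefficients would be the mass-weighted average of the component coefficients:
$$
M = \rho_{1} v_{1} + \rho_{2} v_{2}, \qquad
C = \frac{\rho_{1} v_{1}\, C_{1} + \rho_{2} v_{2}\, C_{2}}{\rho_{1} v_{1} + \rho_{2} v_{2}},
$$
so fixing $M$ to the measured total mass constrains the admissible ($\rho_{1},\rho_{2}$) pairs for a given truncation radius $r$.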
Biography:
He received his PhD from the University of Southern California (Los Angeles, USA). His interests span various fields of applied mathematics, and he has worked as a consultant to various firms on data analytics. Over the past few years, as a data scientist, he has designed experiments, tested hypotheses, built models, and conducted data analysis by building complex algorithms. He has applied advanced statistical and predictive modeling techniques, worked with widely varied firms to identify their requirements and suggest appropriate analytical techniques, identified relevant data and data sources, developed innovative ways to solve data analytics problems, and utilized patterns and volume variations. He has recommended improvements to methods based on the incorporation of new information, communicated with subject experts across widely different domains, educated business professionals on the use of new approaches and statistical validation, and provided overall metrics to various organizations for their use. His core strengths are quantitative and qualitative research and analysis, statistical techniques, pattern detection over defined datasets, expertise in building machine learning algorithms and statistical models, proficiency in forecasting/predictive analytics and optimization algorithms, and a constant search for alternative approaches to any given data-related problem.