18 skills found
lord / Anchors: Self-adjusting computations in Rust
applied-geodesy / Jag3d: Java Applied Geodesy 3D, least-squares adjustment software for geodetic sciences
HPSCIL / CuFSDAF: cuFSDAF is an enhanced FSDAF algorithm parallelized on GPUs. In cuFSDAF, the TPS interpolator is replaced by a modified Inverse Distance Weighted (IDW) interpolator, and the computationally intensive procedures are parallelized with the Compute Unified Device Architecture (CUDA), a parallel computing framework for GPUs. In addition, an adaptive domain-decomposition method adjusts the size of sub-domains to the hardware's properties while preserving accuracy at sub-domain edges.
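The IDW interpolation that cuFSDAF substitutes for TPS can be illustrated with the textbook (unmodified, CPU-only) formulation; the function name and parameters below are illustrative, not the repository's API:

```python
import numpy as np

def idw_interpolate(xy_known, values, xy_query, power=2.0, eps=1e-12):
    """Plain inverse-distance-weighted interpolation (textbook form).

    cuFSDAF uses a *modified* IDW interpolator running on the GPU;
    this sketch only shows the basic weighting scheme.
    """
    # Pairwise distances between query points and known points
    d = np.linalg.norm(xy_query[:, None, :] - xy_known[None, :, :], axis=2)
    w = 1.0 / (d + eps) ** power          # inverse-distance weights
    w /= w.sum(axis=1, keepdims=True)     # normalize per query point
    return w @ values                      # weighted average of known values

known = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
vals = np.array([0.0, 1.0, 1.0, 2.0])
query = np.array([[0.5, 0.5]])
print(idw_interpolate(known, vals, query))  # midpoint of the square -> [1.0] by symmetry
```
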
limingado / NSC: An implementation of Nyström-based spectral clustering with K-nearest-neighbour-based sampling (KNNS) (Pang et al. 2021), aimed at individual tree segmentation from airborne LiDAR point cloud data. When using the code, please cite: Yong Pang, Weiwei Wang, Liming Du, Zhongjun Zhang, Xiaojun Liang, Yongning Li, Zuyuan Wang (2021) Nyström-based spectral clustering using airborne LiDAR point cloud data for individual tree segmentation, International Journal of Digital Earth.
Code files: ‘segmentation.py’ is the main function, including derivation of local maxima from the Canopy Height Model (CHM); ‘VNSC.py’ holds the remaining functions of the algorithm: mean-shift voxelization, similarity-graph construction, KNNS sampling, eigendecomposition, k-means clustering, and the computation and writing of individual tree parameters.
Key parameters: users can adjust the local-maximum window, gap (the upper limit on the number of final clusters), knn (the number of k-nearest neighbours in the similarity graph), and the quantile of the mean-shift method based on their data's characteristics. Currently the local-maximum window is 3 m × 3 m, and gap is defined as 1.5 times the number of local maxima detected from the CHM. knn can be set to a constant (40 in the code) based on the data characteristics, or determined from its relationship with the number of voxels. The default quantile in the mean-shift method is the average density of the point cloud. More details can be found in Pang et al. (2021).
Test data: ‘ALS_pointclouds.txt’ (point cloud data); ‘ALS_CHM.tif’ (CHM of the point cloud data); ‘Reference_tree.csv’ (field measurements for algorithm validation; positions were measured with differential GNSS, and tree heights were obtained by regression estimation).
Outputs: ‘Data_seg.csv’ gives the coordinates (x, y, z) of each point and its cluster label after segmentation; ‘Parameter.csv’ gives individual tree parameters (TreeID, Position_X, Position_Y, Crown, Height) computed as described in Pang et al. (2021).
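The core Nyström step the entry describes (approximating the eigenvectors of a large affinity matrix from a sample of landmark points) can be sketched as follows; this uses plain uniform sampling and an RBF affinity rather than the paper's KNNS scheme and voxel pipeline, and the function name is illustrative:

```python
import numpy as np

def nystrom_embedding(X, m, k, gamma=1.0, seed=0):
    """Approximate the top-k eigenvectors of an RBF affinity matrix
    over n points using only m sampled landmarks (Nystrom method).

    The repository samples landmarks with a KNN-based scheme (KNNS);
    plain uniform sampling is used here to keep the sketch short.
    """
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    idx = rng.choice(n, size=m, replace=False)
    L = X[idx]
    # Affinities between all points and landmarks (C), and among landmarks (W)
    C = np.exp(-gamma * ((X[:, None, :] - L[None, :, :]) ** 2).sum(-1))
    W = C[idx]
    # Eigendecomposition of the small m x m landmark block only
    evals, evecs = np.linalg.eigh(W)
    evals, evecs = evals[::-1][:k], evecs[:, ::-1][:, :k]
    # Nystrom extension of the landmark eigenvectors to all n points
    return C @ evecs / evals

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(size=(50, 2)), rng.normal(size=(50, 2)) + 8.0])
U = nystrom_embedding(X, m=20, k=2)
print(U.shape)  # (100, 2): one k-dimensional spectral embedding per point
```

The rows of `U` would then be fed to k-means, as in the entry's pipeline description.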
jchiquet / Aricode: R package for computing the (adjusted) Rand index and related scores
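The adjusted Rand index that aricode computes follows the standard pair-counting formula over the contingency table of two labelings; a minimal reference implementation (in Python rather than R, for illustration only):

```python
import numpy as np
from math import comb

def adjusted_rand_index(labels_a, labels_b):
    """Adjusted Rand index from the pair-counting contingency table."""
    a_ids, a_inv = np.unique(labels_a, return_inverse=True)
    b_ids, b_inv = np.unique(labels_b, return_inverse=True)
    ct = np.zeros((len(a_ids), len(b_ids)), dtype=int)
    np.add.at(ct, (a_inv, b_inv), 1)           # contingency table
    n = int(ct.sum())
    sum_ij = sum(comb(x, 2) for x in ct.ravel())
    sum_a = sum(comb(x, 2) for x in ct.sum(axis=1))  # row marginals
    sum_b = sum(comb(x, 2) for x in ct.sum(axis=0))  # column marginals
    expected = sum_a * sum_b / comb(n, 2)      # chance-adjustment term
    max_index = (sum_a + sum_b) / 2
    return (sum_ij - expected) / (max_index - expected)

print(adjusted_rand_index([0, 0, 1, 1], [1, 1, 0, 0]))  # same partition -> 1.0
```
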
ocurrent / Current Incr: Self-adjusting computations
JanJereczek / FastIsostasy.jl: Accelerated computation of glacial isostatic adjustment for a laterally variable solid Earth
moradi-coding / Performance And Computational Analysis Of Polarization Adjusted Convolutional PAC Codes: Performance and computational analysis of polarization-adjusted convolutional (PAC) codes; Fano decoding algorithm
cmuparlay / Psac: Parallel Self-Adjusting Computation
applied-geodesy / Bundle Adjustment: Bundle Adjustment for Close-Range Photogrammetry
magnusjonsson / Opti: C code generator for incremental / self-adjusting computations
jiedxu / RobustOptim: Introductory (adjustable) robust optimization via matrix computation, with box uncertainty and a budget of uncertainty.
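For the box-uncertainty case the entry mentions, the standard reformulation replaces an uncertain constraint a^T x <= b, with a ranging over a box [a_c - δ, a_c + δ], by its worst case a_c^T x + δ^T |x| <= b. A small sketch of that worst-case evaluation (the budget-of-uncertainty set is not shown; function name is illustrative):

```python
import numpy as np

def worst_case_lhs(a_center, delta, x):
    """Worst-case value of a^T x over the box a in [a_center - delta, a_center + delta].

    The maximizing a picks a_center[i] + delta[i] where x[i] > 0 and
    a_center[i] - delta[i] where x[i] < 0, giving the |x| reformulation.
    """
    return a_center @ x + delta @ np.abs(x)

a_c = np.array([1.0, 2.0])
delta = np.array([0.5, 0.5])
x = np.array([2.0, -1.0])
print(worst_case_lhs(a_c, delta, x))  # 1*2 + 2*(-1) + 0.5*2 + 0.5*1 = 1.5
```

A robust LP then simply enforces this worst-case left-hand side in place of the nominal one.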
adityassrana / MCV M4 3DV: Implemented image warping, affine and metric rectification, DLT, RANSAC, panorama stitching, Zhang's calibration method, view morphing, stereo matching, depth-map computation, bundle adjustment, and resectioning for Structure from Motion
Samudraneel-98 / Cancer Subtype Prediction Using Multi Omics Dataset:
Importance of cancer subtype prediction: Cancer is a heterogeneous disease caused by chemical, physical, or genetic factors. Identification of cancer subtypes is of great importance for cancer diagnosis and therapy. Bioinformatics approaches have gradually taken the place of clinical observations and pathological experiments, and the development of high-throughput genome-analysis techniques plays an important role in the analysis and clinical treatment of many kinds of cancer.
Omics datasets: Since the mapping and sequencing of the human genome began, new technologies have made it possible to obtain a huge number of molecular measurements within a tissue or cell. These technologies can be applied to a biological system of interest to obtain a snapshot of the underlying biology at a resolution never before possible. Broadly speaking, the scientific fields concerned with measuring such biological molecules in a high-throughput way are called omics. Omics are novel, comprehensive approaches for the analysis of complete genetic or molecular profiles of humans and other organisms. The types of omics data that can be used to develop an omics-based test include genomics, proteomics, transcriptomics, and metabolomics.
Importance of omics data for cancer prediction: Accurately predicting cancer prognosis is necessary for choosing precise treatment strategies for patients. One effective approach is the integration of multi-omics data, which reduces the impact of noise within any single omics dataset. A number of methods have been proposed in recent years to integrate multi-source data for identifying cancer subtypes, and various computational methods based on these expression data have been proposed to predict them. It is crucial to study how to better integrate these multiple data profiles.
Approaches to omics data integration: 1. Integrative NMF; 2. Similarity Network Fusion; 3. Joint Non-negative Matrix Factorization.
Deep learning: Deep learning is an artificial intelligence (AI) function that imitates the workings of the human brain in processing data and creating patterns for use in decision making. It is a subset of machine learning whose networks are capable of learning, unsupervised, from unstructured or unlabeled data.
Hyperparameter tuning: Hyperparameter tuning works by running multiple trials in a single training job. Each trial is a complete execution of the training application with values for the chosen hyperparameters, set within limits you specify. The AI Platform Training service keeps track of the results of each trial and makes adjustments for subsequent trials. When the job is finished, you get a summary of all trials along with the most effective configuration of values according to the criteria you specify.
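The trial loop that the hyperparameter-tuning paragraph describes (sample values within limits, run the training application, keep the best configuration) can be sketched as a plain random search; the function names and the toy objective below are hypothetical stand-ins, not the AI Platform API:

```python
import random

def random_search(train_and_eval, space, n_trials=20, seed=0):
    """Minimal random-search tuner: each trial samples hyperparameters
    from the given (low, high) ranges, runs the training function, and
    the best-scoring configuration seen so far is kept."""
    rng = random.Random(seed)
    best = None
    for _ in range(n_trials):
        params = {name: rng.uniform(lo, hi) for name, (lo, hi) in space.items()}
        score = train_and_eval(params)       # one complete "trial"
        if best is None or score > best[0]:
            best = (score, params)
    return best

def fake_train(p):
    # Toy objective standing in for a real training-and-validation run;
    # its optimum is at lr=0.1, dropout=0.3.
    return -(p["lr"] - 0.1) ** 2 - (p["dropout"] - 0.3) ** 2

score, params = random_search(fake_train, {"lr": (0.0, 1.0), "dropout": (0.0, 0.5)})
print(score, params)
```

Real services refine this loop by adapting later trials to earlier results (e.g. Bayesian optimization) instead of sampling independently.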
Meshaal-Mouawad / Implementing Machine Learning To Analyze Humans Walkings Extracted From Micro Doppler Radar: Three experiments were examined in this technical report on digital signal processing. In experiment 1, we analyzed how the fast Fourier transform (FFT) resolution is affected by adjusting the filter length, the filter type, and the FFT order; both the physical and the computational FFT resolution were discussed. In experiment 2, the data was given in MATLAB and we investigated the signals concurrently in time and frequency using the short-time Fourier transform (STFT); we estimated the sampling rate and briefly discussed how the STFT differs from the FFT. Experiment 3 covered the STFT and the support vector machine (SVM): we implemented an SVM to estimate four walking speeds using features extracted from simulated micro-Doppler radar returns of walking humans. Figures of the STFT magnitude of each class are presented, and the basics of the Doppler effect and micro-Doppler are discussed. The results were examined using confusion matrices for training and testing, and the best results are presented. MATLAB and related tools were used in this project, and the code is provided in the report.
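The STFT that the report uses for joint time-frequency analysis amounts to windowing successive frames of the signal and taking an FFT per frame; a textbook sketch (in Python rather than the report's MATLAB, with illustrative frame sizes):

```python
import numpy as np

def stft_magnitude(x, frame_len=256, hop=128):
    """Short-time Fourier transform magnitude: slide a Hann window
    across the signal and take one FFT per windowed frame."""
    window = np.hanning(frame_len)
    n_frames = 1 + (len(x) - frame_len) // hop
    frames = np.stack([x[i * hop:i * hop + frame_len] * window
                       for i in range(n_frames)])
    # rfft keeps only the non-negative frequency bins of each real frame
    return np.abs(np.fft.rfft(frames, axis=1))

fs = 8000
t = np.arange(fs) / fs
x = np.sin(2 * np.pi * 440 * t)   # one second of a 440 Hz tone
S = stft_magnitude(x)
print(S.shape)                     # (frames, frame_len // 2 + 1)
```

Each row of `S` is one spectral snapshot; stacking them over time is what makes micro-Doppler signatures of walking visible as time-varying frequency tracks.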
phisan-chula / PyBundleBlock: Performs bundle block adjustment computation using 'lmfit'. The program benefits from a modern and convenient Pythonic paradigm.
vefghmhassan / Stl Volume: This Go code snippet calculates the volume, weight, and density of 3D models from STL files, supporting both ASCII and binary formats. It features `STLCalc` for file analysis, with methods for reading, volume computation, and density adjustment, useful in 3D printing and CAD applications.
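The volume computation for a closed triangle mesh such as an STL model is typically done with the signed-tetrahedron method; a Python sketch of that calculation (STL parsing omitted, and whether `STLCalc` uses exactly this formula is an assumption):

```python
import numpy as np

def mesh_volume(triangles):
    """Volume of a closed, consistently oriented triangle mesh:
    sum the signed volumes of the tetrahedra formed by each face
    and the origin, V = |sum dot(v0, cross(v1, v2)) / 6|."""
    t = np.asarray(triangles, dtype=float)                      # (n, 3, 3)
    signed = np.einsum('ij,ij->i', t[:, 0], np.cross(t[:, 1], t[:, 2])) / 6.0
    return abs(signed.sum())

# Tetrahedron with vertices at the origin and the three unit axes: volume 1/6
faces = [
    [(0, 0, 0), (0, 1, 0), (1, 0, 0)],
    [(0, 0, 0), (1, 0, 0), (0, 0, 1)],
    [(0, 0, 0), (0, 0, 1), (0, 1, 0)],
    [(1, 0, 0), (0, 1, 0), (0, 0, 1)],
]
print(mesh_volume(faces))  # 0.1666...
```

Weight then follows as volume times material density, which is presumably where the snippet's density adjustment comes in.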
daijingixn / Performance Evaluation Of PAC Decoding With Deep Neural Networks: Performance and computational analysis of polarization-adjusted convolutional (PAC) decoders with deep neural networks (MLP, CNN, RNN). You may reference the following works if you are using our implementation. You are welcome to contact me for further discussion or cooperation.