- Name: Oulu-IMEDS/KNEEL
- URL: https://github.com/Oulu-IMEDS/KNEEL
- Language: Python 98.4%

# KNEEL: Hourglass Networks for Knee Anatomical Landmark Localization

(c) Aleksei Tiulpin, University of Oulu, 2019

## About

### Approach

In this paper we tackled the problem of anatomical landmark localization in knee radiographs at all stages of osteoarthritis. We combined recent advances in the landmark localization field and distilled them into a novel modification of the hourglass architecture. To train this model, we propose to use mixup, cutout augmentation, and dropout, with no weight decay. We further propose to use transfer learning from low-cost annotations (knee joint centers on the whole knee radiographs). In the paper, we showed that our transfer learning technique significantly boosts performance. Furthermore, having models trained to work with both the whole radiographs and the localized knee joint areas, we were able to build a full pipeline for landmark localization.

### What's included

The repository includes the code for training and testing, annotations for the OAI dataset, and links to the pre-trained models.

## How to install and run

### Preparing the training data

Download the OAI baseline images from https://nda.nih.gov/oai/. Access to the images is free and painless: you just need to register, provide some information about yourself, and agree to the terms of data use. We provide the script and the annotations for creating the cropped ROIs from the original DICOM images.
The annotations are stored in the annotation files shipped with this repository. Execute the aforementioned script on the downloaded DICOM images to generate the cropped ROIs.
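As a rough, hedged illustration of what this ROI-cropping step does (this is not the repository's script; the annotation format, the crop size, and the function name are assumptions), a minimal pydicom-based sketch might look like:

```python
# Minimal ROI-cropping sketch (illustrative only; NOT the repository's script).
# Assumption: the annotations give a knee-joint center (row, col) per DICOM file,
# and a fixed-size square ROI around that center is what we want to extract.
import numpy as np
import pydicom


def crop_roi(dicom_path, center_rc, roi_size_px=700):
    """Crop a square ROI of roi_size_px pixels around center_rc = (row, col)."""
    ds = pydicom.dcmread(dicom_path)
    img = ds.pixel_array.astype(np.float32)

    half = roi_size_px // 2
    r, c = int(center_rc[0]), int(center_rc[1])
    # Clamp the crop window to the image boundaries.
    r0, r1 = max(r - half, 0), min(r + half, img.shape[0])
    c0, c1 = max(c - half, 0), min(c + half, img.shape[1])
    return img[r0:r1, c0:c1]


# Hypothetical usage: the center coordinates would come from the annotation files.
# roi = crop_roi("path/to/knee.dcm", center_rc=(1500, 1200))
```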
After you have created the dataset, you can follow the corresponding script in the repository. Note: you will likely see warnings during this step.

## Reproducing the experiments from the paper

All the experiments in the paper were run with PyTorch 1.1.0 and Anaconda.
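The training recipe described in the Approach section relies on mixup and cutout augmentation. As a hedged illustration only (this is not the repository's implementation; the batch layout and the default alpha are assumptions), a minimal PyTorch-style mixup for images and target heatmaps could be sketched as:

```python
# Minimal mixup sketch for landmark heatmap regression (illustrative only).
# Assumption: a batch of images (B, C, H, W) and target heatmaps (B, K, H', W').
import torch


def mixup_batch(images, heatmaps, alpha=0.7):
    """Blend each sample with a randomly permuted sample from the same batch."""
    lam = torch.distributions.Beta(alpha, alpha).sample().item()
    perm = torch.randperm(images.size(0))
    mixed_images = lam * images + (1.0 - lam) * images[perm]
    mixed_heatmaps = lam * heatmaps + (1.0 - lam) * heatmaps[perm]
    return mixed_images, mixed_heatmaps


# Hypothetical usage inside a training loop:
# imgs, hms = mixup_batch(imgs, hms)
```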
To run the experiments, simply copy the content of the corresponding folder in the repository. To facilitate reproducibility, a conda environment file is provided alongside the inference Docker files (see below).

## Inference on your data
Inference is run through the provided Docker images. In the command given in the repository, you need to replace the placeholders with your own values.
Please note that your NVIDIA driver must be compatible with CUDA 10. You can also build the Docker files yourself if you want.

## Running a flask micro-service

In addition to CLI inference, we also provide a flask micro-service that allows KNEEL to be integrated into a data processing pipeline. We have built support for both CPU and GPU: the Docker commands for running the micro-service on either backend are provided in the repository.
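To illustrate what such a micro-service exposes (this is not the repository's code; the `/predict` endpoint, the JSON field names, and the `predict_landmarks` function are hypothetical), a minimal Flask sketch could look like:

```python
# Minimal Flask micro-service sketch (illustrative only; NOT the KNEEL service).
# Assumptions: a POST endpoint /predict accepting a base64-encoded image in the
# JSON field "image", and a hypothetical predict_landmarks() wrapping the models.
import base64

from flask import Flask, jsonify, request

app = Flask(__name__)


def predict_landmarks(image_bytes):
    # Placeholder for model inference; would return landmark coordinates.
    raise NotImplementedError


@app.route("/predict", methods=["POST"])
def predict():
    payload = request.get_json()
    image_bytes = base64.b64decode(payload["image"])
    landmarks = predict_landmarks(image_bytes)
    return jsonify({"landmarks": landmarks})


if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```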
Now, when the micro-service is deployed, it is fairly easy to get the landmarks from a Python or Node.js script: just send a POST request with a JSON payload containing your image, as in the sketch below.
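A hedged client-side sketch in Python (the host/port, endpoint path, and JSON field names match the hypothetical service sketched above and are assumptions, not the documented KNEEL API):

```python
# Minimal client sketch (illustrative only; endpoint and field names are assumed).
import base64

import requests

# Read a local DICOM file and base64-encode it for the JSON payload.
with open("knee.dcm", "rb") as f:
    payload = {"image": base64.b64encode(f.read()).decode("ascii")}

# Assumes the micro-service is reachable on localhost:5000 at /predict.
resp = requests.post("http://localhost:5000/predict", json=payload, timeout=60)
resp.raise_for_status()
print(resp.json()["landmarks"])
```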
## License

If you use the annotations from this work, you must cite the KNEEL paper (accepted to the ICCV 2019 VRMI Workshop); the full citation entry is provided in the repository.
The code and the pre-trained models are not available for any commercial use, including research for commercial purposes.