Project: flashlight/flashlight
Repository: https://github.com/flashlight/flashlight
Language: C++ (74.9%)

Flashlight is a fast, flexible machine learning library written entirely in C++ from Facebook AI Research and the creators of Torch, TensorFlow, Eigen, and Deep Speech. Its core features are described below.
Native support in C++ and simple extensibility make Flashlight a powerful research framework, enabling fast iteration on new experimental setups and algorithms with few framework-imposed opinions and without sacrificing performance. In a single repository, Flashlight provides apps for research across multiple domains, including automatic speech recognition (asr), image classification (imgclass), and language modeling (lm).
Project Layout

Flashlight is broken down into a few parts: the Flashlight libraries, the core autograd and neural network library, and the domain-specific apps (asr, imgclass, lm).
Quickstart

First, build and install Flashlight and link it to your own project.
Implementing a simple convnet is easy:

#include <flashlight/fl/flashlight.h>
Sequential model;
model.add(View(fl::Shape({IM_DIM, IM_DIM, 1, -1})));
model.add(Conv2D(
1 /* input channels */,
32 /* output channels */,
5 /* kernel width */,
5 /* kernel height */,
1 /* stride x */,
1 /* stride y */,
PaddingMode::SAME /* padding mode (x) */,
PaddingMode::SAME /* padding mode (y) */));
model.add(ReLU());
model.add(Pool2D(
2 /* kernel width */,
2 /* kernel height */,
2 /* stride x */,
2 /* stride y */));
model.add(Conv2D(32, 64, 5, 5, 1, 1, PaddingMode::SAME, PaddingMode::SAME));
model.add(ReLU());
model.add(Pool2D(2, 2, 2, 2));
model.add(View(fl::Shape({7 * 7 * 64, -1})));
model.add(Linear(7 * 7 * 64, 1024));
model.add(ReLU());
model.add(Dropout(0.5));
model.add(Linear(1024, 10));
model.add(LogSoftmax());

Performing forward and backward computation is straightforward:

auto output = model.forward(input);
auto loss = categoricalCrossEntropy(output, target);
loss.backward();

See the MNIST example for a full tutorial including a training loop and dataset abstractions.
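Beyond that pointer, here is a rough, hedged sketch of what such a training loop can look like. It is not taken from the tutorial; trainset is a hypothetical fl::Dataset yielding {input, target} batches, and the optimizer settings are purely illustrative.

// Hypothetical training-loop sketch (assumptions: `model` is the Sequential defined
// above; `trainset` is an fl::Dataset, e.g. a BatchDataset, whose samples are
// {input, target} tensor pairs).
const float learningRate = 0.01;
auto optimizer = fl::SGDOptimizer(model.params(), learningRate);

for (int epoch = 0; epoch < 10; ++epoch) {
  for (auto& batch : trainset) {
    auto input = fl::Variable(batch[0], false /* calcGrad */);
    auto target = fl::Variable(batch[1], false /* calcGrad */);

    auto output = model.forward(input);
    auto loss = fl::categoricalCrossEntropy(output, target);

    optimizer.zeroGrad();  // clear gradients from the previous iteration
    loss.backward();       // backpropagate through the computation graph
    optimizer.step();      // apply the SGD update to the model parameters
  }
}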
Autograd Example

auto A = Variable(fl::rand({1000, 1000}), true /* calcGrad */);
auto B = 2.0 * A;
auto C = 1.0 + B;
auto D = log(C);
D.backward(); // populates A.grad() along with gradients for B, C, and D.
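As a brief, hedged aside that is not part of the original example: once backward() has run, each Variable's gradient can be read back through grad(). A minimal sketch:

// Inspecting the gradient populated by D.backward() above.
// For D = log(1 + 2A), the expected gradient is dD/dA = 2 / (1 + 2A), elementwise.
auto gradA = A.grad();                     // a Variable with the same shape as A
std::cout << gradA.tensor() << std::endl;  // print the underlying fl::Tensor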
Building and Installing

Requirements

At minimum, compilation requires a C++ compiler with good C++17 support and CMake. See the full dependency list for more details if building from source. Instructions for building/installing the Python bindings can be found here.

Flashlight Build Setups

Flashlight can be broken down into several components as described above. Each component can be built incrementally by specifying the correct build options. There are two ways to work with Flashlight: as an installed library that you link to from your own project, or with in-source development, where the Flashlight source itself is modified and rebuilt.
Flashlight can be built in one of two ways: with vcpkg, a C++ package manager, or from source, installing dependencies as needed. The table below lists dependencies by component and backend:
Component | Backend | Dependencies |
---|---|---|
libraries | CUDA | CUDA >= 9.2, CUB*† (if CUDA < 11) |
libraries | CPU | A BLAS library (Intel MKL >= 2018, OpenBLAS†, etc.) |
core | Any | ArrayFire >= 3.7.3†, an MPI library^ (OpenMPI†, etc.), cereal*† >= 1.3.0, stb*† |
core | CUDA | CUDA >= 9.2, NCCL^, cuDNN |
core | CPU | oneDNN† >= 2.5.2, gloo (with MPI)*^† |
app: all | Any | Google Glog†, Gflags† |
app: asr | Any | libsndfile*† >= 10.0.28, a BLAS library (Intel MKL >= 2018, OpenBLAS†, etc.), flashlight/text* |
app: imgclass | Any | - |
app: lm | Any | flashlight/text* |
tests | Any | Google Test (gtest, with gmock)*† >= 1.10.0 |
The Flashlight CMake build accepts the following build options (prefixed with -D when running CMake from the command line):
Name | Options | Default Value | Description |
---|---|---|---|
FL_BUILD_ARRAYFIRE | ON, OFF | ON | Build Flashlight with the ArrayFire backend. |
FL_BUILD_STANDALONE | ON, OFF | ON | Downloads/builds some dependencies if not found. |
FL_BUILD_LIBRARIES | ON, OFF | ON | Build the Flashlight libraries. |
FL_BUILD_CORE | ON, OFF | ON | Build the Flashlight neural net library. |
FL_BUILD_DISTRIBUTED | ON, OFF | ON | Build with distributed training; required for apps. |
FL_BUILD_CONTRIB | ON, OFF | ON | Build contrib APIs subject to breaking changes. |
FL_BUILD_APPS | ON, OFF | ON | Build applications (see below). |
FL_BUILD_APP_ASR | ON, OFF | ON | Build the automatic speech recognition application. |
FL_BUILD_APP_IMGCLASS | ON, OFF | ON | Build the image classification application. |
FL_BUILD_APP_LM | ON, OFF | ON | Build the language modeling application. |
FL_BUILD_APP_ASR_TOOLS | ON, OFF | ON | Build automatic speech recognition app tools. |
FL_BUILD_TESTS | ON, OFF | ON | Build tests. |
FL_BUILD_EXAMPLES | ON, OFF | ON | Build examples. |
FL_BUILD_EXPERIMENTAL | ON, OFF | OFF | Build experimental components. |
CMAKE_BUILD_TYPE | See docs. | Debug | See the CMake documentation. |
CMAKE_INSTALL_PREFIX | [Directory] | See docs. | See the CMake documentation. |
Flashlight is most easily linked against using CMake. Flashlight exports the following CMake targets when installed:

flashlight::fl_lib_set — contains flashlight libraries for headers and symbols pertaining to sets.
flashlight::fl_lib_sequence — contains flashlight libraries for headers and symbols pertaining to sequences.
flashlight::fl_lib_audio — contains flashlight libraries for headers and symbols pertaining to audio.
flashlight::flashlight — contains flashlight libraries as well as the flashlight core autograd and neural network library.
flashlight::fl_pkg_runtime — contains flashlight core as well as common utilities for training (logging / flags / distributed utils).
flashlight::fl_pkg_vision — contains flashlight core as well as common utilities for vision pipelines.
flashlight::fl_pkg_text — contains flashlight core as well as common utilities for dealing with text data.
flashlight::fl_pkg_speech — contains flashlight core as well as common utilities for dealing with speech data.
flashlight::fl_pkg_halide — contains flashlight core and extensions to easily interface with Halide.

Given a simple project.cpp file that includes and links to Flashlight:
#include <iostream>
#include <flashlight/fl/flashlight.h>
int main() {
fl::Variable v(fl::full({1}, 1.), true);
auto result = v + 10;
std::cout << "Tensor value is " << result.tensor() << std::endl; // 11.000
return 0;
}
The following CMake configuration links Flashlight and sets include directories:
cmake_minimum_required(VERSION 3.10)
set(CMAKE_CXX_STANDARD 17)
set(CMAKE_CXX_STANDARD_REQUIRED ON)
add_executable(myProject project.cpp)
find_package(flashlight CONFIG REQUIRED)
target_link_libraries(myProject PRIVATE flashlight::flashlight)
vcpkg Flashlight Installation

If you installed Flashlight with vcpkg, the above CMake configuration for myProject can be built by running:
cd project && mkdir build && cd build
cmake .. \
-DCMAKE_TOOLCHAIN_FILE=[path to vcpkg clone]/scripts/buildsystems/vcpkg.cmake \
-DCMAKE_BUILD_TYPE=Release
make -j$(nproc)
If using a from-source installation of Flashlight, Flashlight will be found automatically by CMake:
cd project && mkdir build && cd build
cmake .. -DCMAKE_BUILD_TYPE=Release
make -j$(nproc)
If Flashlight is installed in a custom location using a CMAKE_INSTALL_PREFIX, passing -Dflashlight_DIR=[install prefix]/share/flashlight/cmake as an argument to your cmake command can help CMake find Flashlight.
Flashlight and its dependencies can also be built with the provided Dockerfiles — see the accompanying Docker documentation for more information.
Contact: [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected]
Flashlight is being very actively developed. See CONTRIBUTING for more on how to help out.
Some of Flashlight's code is derived from arrayfire-ml.
You can cite Flashlight using:
@misc{kahn2022flashlight,
title={Flashlight: Enabling Innovation in Tools for Machine Learning},
author={Jacob Kahn and Vineel Pratap and Tatiana Likhomanenko and Qiantong Xu and Awni Hannun and Jeff Cai and Paden Tomasello and Ann Lee and Edouard Grave and Gilad Avidov and Benoit Steiner and Vitaliy Liptchinsky and Gabriel Synnaeve and Ronan Collobert},
year={2022},
eprint={2201.12465},
archivePrefix={arXiv},
primaryClass={cs.LG}
}
Flashlight is under an MIT license. See LICENSE for more information.