FPGA architecture for feed-forward sequential memory network targeting long-term time-series forecasting

Read:: - [ ] Orimo et al. (2016) - FPGA architecture for feed-forward sequential memory network targeting long-term time-series forecasting ➕2024-12-19 !!2 rd citation todoist Print::  ❌ Zotero Link:: Zotero Files:: attachment Reading Note:: Web Rip:: url:: https://ieeexplore.ieee.org/document/7857169/citations?tabFilter=papers#citations

```dataview
TABLE without id
file.link as "Related Files",
title as "Title",
type as "type"
FROM "" AND -"Obsidian Assets"
WHERE citekey = "orimoFPGAArchitectureFeedforward2016"
SORT file.cday DESC
```

Abstract

Deep learning is widely used in various applications, and diverse neural networks have been proposed. One such network, the novel feed-forward sequential memory network (FSMN), aims to forecast future data by extracting time-series features. The FSMN is a standard feed-forward neural network equipped with time-domain filters, so it can forecast without recurrent feedback. In this paper, we propose a field-programmable gate array (FPGA) architecture for this model and show that its resource usage does not increase exponentially as the network scale increases.

Quick Reference

Top Notes

Tasks

Topics

Recently, a feed-forward sequential memory network (FSMN) was proposed [6] [7]. It consists of a standard feed-forward neural network and time-domain filters, and aims at long-term time-series forecasting. tp

Zhang et al. proposed the FSMN, a feed-forward neural network with time-domain filters and no recurrent feedback in its middle layers, for time-series forecasting. Because its memory of past information is finite, the FSMN avoids the vanishing gradient problem and can be used in applications that require long-term time-series forecasting. tp

The FSMN can attach time-domain filters to multiple middle layers and can be trained through backpropagation. tp
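The memory mechanism described above can be sketched as a finite-impulse-response filter over past hidden activations. This is a minimal illustrative sketch, not the paper's FPGA design or Zhang et al.'s exact formulation; the function name, shapes, and tap values are assumptions for illustration. Each output at time t is a learned weighted sum of the current and the N previous hidden-state vectors, so the layer remains purely feed-forward with a finite memory and no recurrent feedback.

```python
def fsmn_memory(h, a):
    """Apply a scalar FSMN-style memory filter to a hidden-state sequence.

    h: list of T hidden-state vectors, each of length D.
    a: list of N+1 filter taps; a[0] weights the current step,
       a[i] weights the state i steps in the past.
    Returns the filtered sequence (zero-padded before t = 0), so
    out[t][d] = sum_{i=0..N, t-i>=0} a[i] * h[t-i][d].
    """
    T, D = len(h), len(h[0])
    out = [[0.0] * D for _ in range(T)]
    for t in range(T):
        for i, tap in enumerate(a):
            if t - i >= 0:  # finite memory: only a bounded window of the past
                for d in range(D):
                    out[t][d] += tap * h[t - i][d]
    return out


# Usage: 5 time steps, 3 hidden units, a 3-tap filter (current + 2 past steps).
h = [[1.0, 1.0, 1.0] for _ in range(5)]
a = [0.5, 0.25, 0.25]
filtered = fsmn_memory(h, a)
print(filtered[0])  # first step sees only a[0]*h[0] -> [0.5, 0.5, 0.5]
print(filtered[2])  # full window available     -> [1.0, 1.0, 1.0]
```

Because the filter window is fixed at N+1 steps, gradients flow through at most N+1 terms rather than an unbounded recurrence, which is the property the note credits with avoiding the vanishing-gradient problem.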