Torrent details for "Bingi K. Fractional-Order Activation Functions for Neural Networks...2025"
Checked by: -
Category: -
Language: None
Total Size: 15.5 MB
Info Hash: B9651C46BED16A92753718F6830B30251D0929EB
Added By: andryold1
Added: May 28, 2025, 1:05 p.m.
Stats: (Last updated: May 28, 2025, 1:08 p.m.)
| File | Size |
|---|---|
| Bingi K. Fractional-Order Activation Functions for Neural Networks...2025.pdf | 15.5 MB |
| Name | Uploader | Size | S/L | Added |
|---|---|---|---|---|
| - | andryold1 | 15.5 MB | 31/15 | 2025-05-28 |
| - | SkilletWarez | 210.5 MB | 2/4 | 2026-03-06 |
| - | TheExecutive | 186.4 MB | 42/41 | 2025-07-07 |
NOTE
SOURCE: Bingi K. Fractional-Order Activation Functions for Neural Networks...2025
-----------------------------------------------------------------------------------
COVER

-----------------------------------------------------------------------------------
MEDIAINFO
Textbook in PDF format.

This book develops single- and multi-layer fractional-order neural networks that incorporate fractional-order activation functions derived using fractional-order derivatives. Activation functions are essential in neural networks because they introduce nonlinearity, enabling models to learn complex patterns in data. However, traditional activation functions have limitations such as non-differentiability, vanishing gradients, and inactive neurons at negative inputs, which can degrade network performance, especially for tasks involving intricate nonlinear dynamics. To address these issues, fractional-order derivatives from fractional calculus have been proposed; such derivatives can model complex systems with non-local or non-Markovian behavior. The aim is to improve wind power prediction accuracy using datasets from the Texas wind turbine and the Jeju Island wind farm under various scenarios.

The book explores the advantages of fractional-order activation functions in terms of robustness, faster convergence, and greater flexibility in hyper-parameter tuning. It includes a comparative analysis of single- and multi-layer fractional-order neural networks versus conventional neural networks, assessing their performance with metrics such as mean square error and the coefficient of determination. The impact on network performance of using machine learning models to impute missing data is also discussed.

This book demonstrates the potential of fractional-order activation functions to enhance neural network models, particularly in predicting chaotic time series. The findings suggest that fractional-order activation functions can significantly improve accuracy and performance, emphasizing the importance of advancing activation function design in neural network analysis.
Additionally, the book is a valuable teaching and learning resource for undergraduate and postgraduate students conducting research in this field.
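To illustrate the core idea, here is a minimal, hypothetical sketch (not taken from the book) of a fractional-order activation function, together with the two evaluation metrics the description mentions. It uses the known Caputo fractional derivative of the power function, D^alpha t = t^(1-alpha) / Gamma(2-alpha), applied to the positive branch of ReLU; the names `fractional_relu`, `mse`, and `r2` are illustrative, not the book's API.

```python
import math

def fractional_relu(x: float, alpha: float = 0.5) -> float:
    """Hypothetical fractional-order ReLU variant.

    On x > 0 it returns the Caputo derivative of order alpha of f(t) = t,
    i.e. x**(1 - alpha) / Gamma(2 - alpha); negative inputs map to zero,
    as in ordinary ReLU. As alpha -> 0 the positive branch recovers x.
    """
    if x <= 0.0:
        return 0.0
    return x ** (1.0 - alpha) / math.gamma(2.0 - alpha)

def mse(y_true: list[float], y_pred: list[float]) -> float:
    """Mean square error, one of the metrics cited in the description."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def r2(y_true: list[float], y_pred: list[float]) -> float:
    """Coefficient of determination (R^2): 1 - SS_res / SS_tot."""
    mean = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean) ** 2 for t in y_true)
    return 1.0 - ss_res / ss_tot
```

The order alpha acts as an extra hyper-parameter of the activation, which is the flexibility in hyper-parameter tuning the description refers to.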