Journal Information
Applied Computational Intelligence and Soft Computing (ACISC)
https://onlinelibrary.wiley.com/journal/4795
Impact Factor:
2.400
Publisher:
Hindawi
ISSN:
1687-9724
Call for Papers
Aims and scope

Applied Computational Intelligence and Soft Computing focuses on the disciplines of computer science, engineering, and mathematics. The scope of the journal includes the development of applications across the natural and social sciences that employ the technologies of computational intelligence and soft computing. Although computational intelligence and soft computing are established fields, their new applications are still in development and can be regarded as an emerging field, which is the focus of this journal.

The application areas of interest include but are not limited to:

    Interval Analysis (Real Interval Arithmetic, Complex Interval Arithmetic, Interval Equations, etc.)
    Interval Mathematics (Metric Topology for Intervals, Interval Integrals, Interval Differential Equations, etc.)
    Interval Computation (Matrix Computation with Intervals, Systems of Interval Equations, etc.)
    Fuzzy Sets (Fuzzy Numbers, Extension Principle, Fuzzy Rough Sets, Fuzzy Competence Sets, etc.)
    Fuzzy Systems (Fuzzy Control, Fuzzy Neural Networks, Genetic Fuzzy Systems, Hybrid Intelligent Systems, etc.)
    Fuzzy Logics (Many-Valued Logics, Type-2 Fuzzy Logics, Intuitionistic Fuzzy Logics, etc.)
    Fuzzy Mathematics (Fuzzy Differential Equations, Fuzzy Real Analysis, Fuzzy Topology, Fuzzy Algebra, etc.)
    Fuzzy Optimization (Possibilistic Programming, Fuzzy Linear and Nonlinear Programming, Fuzzy Stochastic Optimization, etc.)
    Fuzzy Statistical Analysis (Fuzzy Random Variables, Fuzzy Regression Analysis, Fuzzy Reliability Analysis, Fuzzy Time Series, etc.)
    Operations Research (Fuzzy Games, Fuzzy Inventory Models, Fuzzy Queueing Theory, Fuzzy Scheduling Problems, etc.)
    Heuristics (Ant Colony Optimization, Artificial Immune Systems, Genetic Algorithms, Particle Swarm Intelligence, Simulated Annealing, Tabu Search, etc.)
    Hybrid Systems (The fusion of Fuzzy Systems and Computational Intelligence)
    Approximate Reasoning (Possibility Theory, Mathematical Theory of Evidence, Fuzzy Common Knowledge, etc.)
    Miscellaneous (Fuzzy Data Mining, Fuzzy Biomedical Systems, Pattern Recognition, Fuzzy Clustering, Information Retrieval, Chaotic Systems, etc.)
Last updated by Dou Sun on 2024-08-25
Special Issues
Special Issue on Recent Developments in Artificial Neural Network Structure Optimization and Parameter Initialization
Submission Deadline: 2024-12-27

Description

In recent years, significant progress has been made in both the theory and applications of artificial neural networks (ANNs) and their hybrid variants. They have been successfully applied to real-world problems such as modeling, adaptive control, and classification, largely thanks to the development of techniques that tune the parameters of these neural network-based models. ANN models and their variants often require a large number of parameters and mathematical operations to make accurate predictions; designing optimal architectures for these networks therefore speeds up their learning process and improves their generalization capability. Another important factor affecting the accuracy and training time of these models is the initial setting of their parameter values: if the parameters are initialized well, training time can be reduced and model performance improved.

Neural network pruning is the process of simplifying a neural network by eliminating some of its components, such as weights, activations, or entire layers. The goal of pruning is to reduce the number of parameters, operations, and memory required by the network without compromising its functionality or accuracy. In some instances, the neural network may begin small and then grow to meet the requirements of the problem, so parameters can be eliminated or added to improve the performance of the network. Neural network pruning-growing and parameter initialization are not easy tasks and present challenges and trade-offs. For example, it can be difficult to find the pruning ratio that best balances complexity and functionality, which may require trial-and-error or adaptive methods. Additionally, pruning-growing affects not only the efficiency but also the quality and reliability of the network. Evaluating the impact of pruning may require rigorous testing and validation, or benchmarks that capture the trade-offs between speed, size, accuracy, and robustness.

The goal of this Special Issue is to gather papers that offer novel insights, innovative approaches, and advances in methods for optimizing the structure of neural models and in frameworks for initializing their parameters to reduce training time. We welcome both original research and review articles.
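The two themes of this call, magnitude-based pruning and Xavier (Glorot) initialization, are illustrated by the minimal NumPy sketch below. It is purely illustrative background, not a method prescribed by the Special Issue; the helper names xavier_init and magnitude_prune, the layer sizes, and the pruning ratio are assumptions made for the example.

```python
# Minimal illustration (assumed example, not from the call): Xavier/Glorot
# initialization of a dense layer followed by magnitude-based weight pruning.
import numpy as np

def xavier_init(n_in, n_out, rng):
    """Sample weights uniformly from [-limit, limit], limit = sqrt(6 / (n_in + n_out))."""
    limit = np.sqrt(6.0 / (n_in + n_out))
    return rng.uniform(-limit, limit, size=(n_in, n_out))

def magnitude_prune(weights, prune_ratio=0.5):
    """Zero out the prune_ratio fraction of weights with the smallest magnitude."""
    threshold = np.quantile(np.abs(weights), prune_ratio)
    mask = np.abs(weights) >= threshold
    return weights * mask, mask

rng = np.random.default_rng(seed=0)
W = xavier_init(64, 32, rng)                       # 64-input, 32-unit dense layer
W_pruned, mask = magnitude_prune(W, prune_ratio=0.5)
print(f"nonzero weights kept: {mask.mean():.0%}")  # roughly half of the 2048 weights
```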
Potential topics include but are not limited to the following:

    Neural network-based model pruning techniques for efficient model compression using evolutionary and nature-inspired algorithms
    Detailed methodological approaches in evolutionary and nature-inspired algorithms for model compression
    Magnitude and structured pruning methods and their novel combinations
    Application of structurally optimized neural models in modeling, adaptive control, and time-series forecasting
    Development of novel methods based on zero, random, and Xavier methods for setting the model's initial parameter values
    Structure optimization and initialization methods for neural-based models
    Comparisons of various structural optimization strategies for neural-based models
    Structurally optimized recurrent neural models such as long short-term memory (LSTM), locally connected, or fully connected neural network models
    Application of constructive algorithms to synthesize arbitrarily connected neural networks
    Combining novel weight initialization methods and constructive-covering algorithms to configure a single feedforward neural network
    Self-organizing neural-based methods and their stability analysis

Editors

Lead Editor
    Rajesh Kumar, National Institute of Technology Kurukshetra, Kurukshetra, India
Guest Editors
    Smriti Srivastava, Netaji Subhas University of Technology, India
    Elena Verdu Perez, Universidad Internacional De La Rioja, Logroño, Spain
Last updated by Dou Sun on 2024-08-25
Related Journals
CCF | Full Name | Impact Factor | Publisher | ISSN
a | Information and Computation | 0.800 | Elsevier | 0890-5401
  | IET Electric Power Applications | | IET | 1751-8660
b | IEEE/ACM Transactions on Computational Biology and Bioinformatics | 3.600 | IEEE/ACM | 1545-5963
  | Engineering Analysis with Boundary Elements | 4.200 | Elsevier | 0955-7997
  | Applied Mathematics & Optimization | 1.600 | Springer | 0095-4616
b | IEEE Transactions on Evolutionary Computation | 14.30 | IEEE | 1089-778X
  | Automation and Remote Control | 0.600 | Springer | 0005-1179
b | Software Testing, Verification and Reliability | 1.500 | John Wiley & Sons, Ltd | 1099-1689
c | Information Retrieval | 1.700 | Springer | 1386-4564
  | Computers & Operations Research | 4.100 | Elsevier | 0305-0548
Related Conferences
CCF | CORE | QUALIS | Abbreviation | Full Name | Submission | Notification | Conference
  |   |    | ICICSE' | IEEE International Conference on Information Communication and Software Engineering | 2024-04-01 | 2024-04-15 | 2024-05-10
  | a | b3 | VLSI-SoC | International Conference on VLSI and System-on-Chip | 2022-05-02 | 2022-07-04 | 2022-10-03
  |   |    | MEB | International Conference on Medical Engineering and Bioinformatics | 2019-11-20 | | 2019-12-14
  |   |    | ICWISE | IEEE Conference on Wireless Sensors | 2018-09-15 | 2018-09-30 | 2018-11-20
  |   |    | ICITE' | International Conference on Intelligent Transportation Engineering | 2024-08-05 | 2024-09-20 | 2024-10-18
  |   |    | DNLP | International Conference on Data Mining and NLP | 2022-12-10 | 2022-12-20 | 2023-01-02
  |   |    | ICCAE | International Conference on Computer and Automation Engineering | 2024-12-05 | 2025-01-01 | 2025-03-20
  |   |    | WODES | International Workshop on Discrete Event Systems | 2023-12-11 | 2014-02-11 | 2024-04-29
  |   |    | EIBDCT | International Conference on Electronic Information Engineering, Big Data and Computer Technology | 2025-02-17 | | 2025-02-21
  |   |    | RAM | International Conference on Robotics, Automation and Mechatronics | 2015-01-31 | 2015-03-31 | 2015-07-15