Artificial Intelligence Course Assignment: Translation

Date: 2025-04-21

Adaptive Evolutionary Artificial Neural Networks for Pattern Classification

Abstract—This paper presents a new evolutionary approach called the hybrid evolutionary artificial neural network (HEANN) for simultaneously evolving an artificial neural network (ANN) topology and weights. Evolutionary algorithms (EAs) with strong global search capabilities are likely to provide the most promising region. However, they are less efficient in fine-tuning the search space locally. HEANN emphasizes the balancing of global search and local search for the evolutionary process by adapting the mutation probability and the step size of the weight perturbation. This distinguishes it from most previous studies, which incorporate an EA to search for the network topology and gradient learning for weight updating. Four benchmark functions were used to test the evolutionary framework of HEANN. In addition, HEANN was tested on seven classification benchmark problems from the UCI machine learning repository. Experimental results show the superior performance of HEANN in fine-tuning the network complexity within a small number of generations while preserving the generalization capability compared with other algorithms.
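The adaptation mechanism named in the abstract (shrinking the topology-mutation probability and the weight-perturbation step size so the search moves from global exploration to local fine-tuning) can be pictured with the minimal Python sketch below. It is only an illustration, not the HEANN implementation: the linear annealing schedule, the function names adapt_rates and mutate, the bit-mask encoding of connections, and all parameter values are assumptions made for this sketch.

import numpy as np

rng = np.random.default_rng(0)

def adapt_rates(gen, max_gen, p_max=0.5, p_min=0.01, s_max=1.0, s_min=0.05):
    # Illustrative annealing (assumed, not HEANN's actual rule): start with a
    # large mutation probability and step size (global search) and shrink both
    # over the generations (local fine-tuning).
    frac = gen / max_gen
    p_mut = p_max - (p_max - p_min) * frac   # topology-mutation probability
    step = s_max - (s_max - s_min) * frac    # weight-perturbation step size
    return p_mut, step

def mutate(conn, weights, p_mut, step):
    # conn is a 0/1 mask over candidate connections; weights holds their values.
    flip = rng.random(conn.shape) < p_mut
    conn = np.where(flip, 1.0 - conn, conn)                        # grow or prune connections
    weights = weights + rng.normal(0.0, step, weights.shape) * conn  # perturb active weights
    return conn, weights

# Toy usage: a genome of five candidate connections evolved for 100 generations.
conn = rng.integers(0, 2, size=5).astype(float)
weights = rng.normal(0.0, 0.5, size=5)
for gen in range(100):
    p_mut, step = adapt_rates(gen, max_gen=100)
    conn, weights = mutate(conn, weights, p_mut, step)

However HEANN actually drives the adaptation, the sketch only shows where the two adapted quantities enter the mutation step.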

I. INTRODUCTION

Artificial neural networks (ANNs) have emerged as a powerful tool for pattern classification [1], [2]. The optimization of ANN topology and the training of connection weights are often treated separately. Such a divide-and-conquer approach gives rise to an imprecise evaluation of the selected ANN topology. In fact, these two tasks are interdependent and should be addressed simultaneously to achieve optimum results.

One of the key tasks of pattern classification is designing a compact and well-generalized ANN topology. Choosing an appropriate ANN topology for specific problems is critical for ANN generalization because of the strong correlation between the information processing capability and the ANN topology. An excessively small network size suggests that the problem cannot be learned well, whereas an excessively large network size will lead to over-fitting and poor generalization performance. Time-consuming trial-and-error approaches and hill-climbing constructive or pruning algorithms [3]–[7] used to design an ANN architecture for a given task only explore small architectural subsets and tend to be stopped at structural local optima. The cascaded-correlation neural network [8] is a popular constructive algorithm used to construct ANN topologies that have multiple layers. New hidden nodes are added one by one and are connected with every existing hidden node in the current network. Thus, the network can be seen as having multiple one-unit layers that form a cascade structure. However, the network is prone to structural local optima because of its constructive behavior. Designing an ANN topology using evolutionary algorithms (EAs) has become a popular method to overcome the drawbacks of the constructive or pruning approaches [9]–[13]. EAs, which have a strong global search capability, can effectively search through the near-complete class of ANN topologies.
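The cascade structure described above can be made concrete with the short Python sketch below. It builds only the connectivity pattern of a cascade network, in which each new hidden node is fed by every input and every previously added hidden node; it deliberately omits the candidate-node training and correlation maximization of the actual cascaded-correlation algorithm, and the function name cascade_topology is chosen here purely for illustration.

def cascade_topology(n_inputs, n_hidden):
    # incoming[k] lists the units feeding hidden node k: all external inputs
    # plus every hidden node added before it, so each addition forms its own
    # one-unit layer in the cascade.
    incoming = []
    for k in range(n_hidden):
        inputs = list(range(n_inputs))                 # external inputs 0 .. n_inputs-1
        earlier = [n_inputs + j for j in range(k)]     # previously added hidden nodes
        incoming.append(inputs + earlier)
    return incoming

# Example: 3 inputs and 4 hidden nodes added one at a time.
for k, sources in enumerate(cascade_topology(3, 4)):
    print(f"hidden node {k} receives connections from units {sources}")

Output nodes (not shown) are likewise connected to all inputs and all hidden nodes, which is why the network behaves like a stack of one-unit layers.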

Much work has been devoted to the evolution of ANN topologies. Two major approaches to evolving ANN topologies reported in the literature are the evolution of ANN topology without weights and the simultaneous evolution of both topology and weights. For the evolution of an ANN topology without weights, the ANN topology has to be trained from a random set of initial weights to evaluate its fitness. Yao and Liu [ …
