圈圈学论文 (3): A Grey Relational TOPSIS Decision Method Based on the Grey Number Dominance Degree
Because human cognition is limited and decision problems are complex, multi-attribute decision-making problems with uncertain decision information are widespread in engineering and economic management. Grey relational analysis can handle multi-attribute decision problems with "small samples" and "poor information", requires little computation, and combines easily with other methods, so it is widely used in research on decision methods under uncertain decision information. This article introduces a grey relational TOPSIS decision method based on the grey number dominance degree, which reduces the greyness of the original decision information and simplifies the procedure.
Problem description:
Step 1: Normalize the decision matrix
To eliminate the influence of measurement units on the final result and make different attributes comparable, the grey range transformation is applied to the decision matrix X, which yields a dimensionless normalized decision matrix.
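A minimal sketch of this step, assuming crisp (real-valued) attribute data rather than the three-parameter interval grey numbers used in the source; the names X, Y and benefit_cols are illustrative only:

```python
import numpy as np

def range_normalize(X, benefit_cols):
    """Grey range (min-max) transformation of a crisp decision matrix X.

    X has one row per alternative and one column per attribute.
    benefit_cols is a boolean array: True marks benefit attributes (larger is
    better), False marks cost attributes (smaller is better).  Every
    normalized column ends up in [0, 1] with 1 as the best value.
    """
    X = np.asarray(X, dtype=float)
    col_min, col_max = X.min(axis=0), X.max(axis=0)
    span = np.where(col_max > col_min, col_max - col_min, 1.0)  # guard against zero range
    Y = (X - col_min) / span
    Y[:, ~benefit_cols] = 1.0 - Y[:, ~benefit_cols]             # flip cost attributes
    return Y

# toy data: 4 alternatives, 3 attributes (the last attribute is cost-type)
X = np.array([[7.0, 0.62, 120.0],
              [9.0, 0.55, 150.0],
              [6.5, 0.70, 100.0],
              [8.0, 0.60, 130.0]])
Y = range_normalize(X, benefit_cols=np.array([True, True, False]))
print(Y)
```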
Step 2: Determine the positive and negative ideal decision vectors
On the basis of the normalized decision matrix, the positive and negative ideal decision vectors are obtained with the grey-number-dominance-based comparison and ranking method. Denote the positive ideal decision vector by y+ and the negative ideal decision vector by y-.
In this way the positive and negative ideal decision vectors y+ and y- of the decision matrix are determined.
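Continuing the crisp sketch (the source selects the ideal values through a grey-number dominance ranking, which is not reproduced here), after range normalization every attribute is benefit-type, so the ideal vectors can simply be taken column-wise:

```python
# positive / negative ideal decision vectors of the normalized matrix Y
y_pos = Y.max(axis=0)   # best attainable value of each attribute
y_neg = Y.min(axis=0)   # worst attainable value of each attribute
print("y+ =", y_pos)
print("y- =", y_neg)
```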
Next, using the positive and negative ideal decision vectors, the dominance degree of each alternative's attribute values is calculated: the dominance of each component of the positive ideal decision vector over the corresponding decision value, and the dominance of each decision value over the corresponding component of the negative ideal decision vector, abbreviated as the positive dominance SPD+ and the negative dominance SPD-.
Once the positive and negative dominance degrees of all decision values have been calculated, the positive dominance degrees form the positive dominance matrix S+ and the negative dominance degrees form the negative dominance matrix S-.
Finally, the dominance of each component of the positive and negative ideal decision vectors over itself is required. A decision value compared with itself has identical parameters, so this self-dominance is 0 for every component, and the self-dominance values of the positive and negative ideal decision vectors form two zero vectors, written R+ and R- below.
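The exact grey-number dominance formulas come from the cited thesis and are not reproduced in this post, so the sketch below uses a simple nonnegative difference as a stand-in for the dominance degree; only the overall shapes of S+, S-, R+ and R- match the description above:

```python
# crude crisp stand-in for the dominance matrices of the source
S_pos = y_pos - Y              # SPD+: how far each decision value falls short of the positive ideal
S_neg = Y - y_neg              # SPD-: how far each decision value exceeds the negative ideal

# a value compared with itself has zero dominance, so the reference vectors are zero vectors
R_pos = np.zeros(Y.shape[1])   # R+
R_neg = np.zeros(Y.shape[1])   # R-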
Step 3: Optimize the attribute weights
Suppose the decision maker has already given a subjective weight vector for the actual decision problem, and let the objective weight vector obtained by optimization be W*.
Taking the minimization of the sum of all positive dominance degrees and the maximization of the sum of all negative dominance degrees as simultaneous objectives, the objective weight vector W* is solved from the optimization model M1.
In M1 the exponent a, with 0 ≤ a ≤ 1, expresses how much importance is attached to the positive dominance of the decision values; the larger a is, the greater that importance. Next, to take the subjective weight information into account, the optimization model M2 is built with the objective of minimizing the deviation between the subjective and objective weights.
Finally, M1 and M2 are combined into a multi-objective optimization model M, whose solution gives the combined weight vector w of the decision attributes.
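The post does not spell out M1, M2 or M, so the following is only an illustrative single-objective surrogate of the stated idea: weighted positive dominance is pushed down, weighted negative dominance is pushed up (balanced by a), and the result is kept close to the subjective weights. The names w0 and beta and the quadratic deviation term are assumptions of this sketch, not the source's formulation:

```python
import numpy as np
from scipy.optimize import minimize

def combined_weights(S_pos, S_neg, w0, a=0.5, beta=1.0):
    """Illustrative surrogate for the weight models M1/M2/M.

    Minimizes  a * sum_ij w_j * SPD+_ij  -  (1 - a) * sum_ij w_j * SPD-_ij
               + beta * ||w - w0||^2
    subject to sum(w) = 1 and w >= 0, where w0 is the subjective weight vector.
    """
    n = S_pos.shape[1]
    p = S_pos.sum(axis=0)                    # column totals of positive dominance
    q = S_neg.sum(axis=0)                    # column totals of negative dominance

    def objective(w):
        return a * (w @ p) - (1 - a) * (w @ q) + beta * np.sum((w - w0) ** 2)

    constraints = ({'type': 'eq', 'fun': lambda w: w.sum() - 1.0},)
    bounds = [(0.0, 1.0)] * n
    result = minimize(objective, x0=w0, bounds=bounds,
                      constraints=constraints, method='SLSQP')
    return result.x

w0 = np.array([0.4, 0.35, 0.25])             # hypothetical subjective weights
w = combined_weights(S_pos, S_neg, w0, a=0.5, beta=1.0)
print("combined weights:", np.round(w, 3))
```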
Step 4: Dominance-based grey relational analysis
Based on the results above, grey relational analysis is applied to the positive and negative dominance matrices. When the positive ideal dominance vector R+, which is a zero vector, is taken as the system behavior sequence, a smaller positive dominance means the decision value lies closer to the corresponding component of the positive ideal decision vector; the closer the dominance is to zero, the stronger the relation, and vice versa. The same reasoning applies on the negative ideal side. Accordingly, the closeness relational degrees of the dominance matrices S+ and S- with respect to the zero vectors R+ and R- are computed, with the attribute weights incorporated into the calculation.
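The closeness relational degree formula is likewise not shown in the post; as a stand-in, the sketch below uses a Deng-style grey relational coefficient against the zero reference vectors and then forms the weighted relational degree with the combined weights w from the previous sketch:

```python
def weighted_grey_relation(S, R, w, rho=0.5):
    """Deng-style grey relational degree of each row of S to the reference sequence R.

    The closer a dominance value is to the (zero) reference, the larger its
    relational coefficient; rho is the usual distinguishing coefficient.
    """
    delta = np.abs(S - R)                    # pointwise distance to the reference
    d_min, d_max = delta.min(), delta.max()
    coeff = (d_min + rho * d_max) / (delta + rho * d_max)
    return coeff @ w                         # weighted relational degree per alternative

r_pos = weighted_grey_relation(S_pos, R_pos, w)   # relation of S+ to the zero vector R+
r_neg = weighted_grey_relation(S_neg, R_neg, w)   # relation of S- to the zero vector R-
```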
The TOPSIS idea is then used to aggregate the grey closeness relational degrees into a comprehensive closeness relational degree for each alternative a_i.
The comprehensive closeness relational degree reflects both the degree of superiority of each decision alternative and, from the standpoints of closeness and similarity, how the alternative relates to the positive and negative ideal alternatives. Comparing the comprehensive closeness relational degrees of all alternatives identifies the optimal one.
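One common TOPSIS-style way to aggregate the two relational degrees (the source's exact aggregation formula is not shown) is the ratio below: an alternative should relate strongly to the positive reference and weakly to the negative one, so a larger value is better:

```python
# comprehensive closeness relational degree and the resulting ranking
C = r_pos / (r_pos + r_neg)
ranking = np.argsort(-C)                     # alternative indices, best first
print("comprehensive closeness:", np.round(C, 3))
print("ranking (best to worst):", ranking + 1)
```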
English learning:
Due to the limitations of human cognition and the complexity of decision-making problems, multi-attribute decision-making problems with uncertain decision information widely exist in engineering and economic management. Because grey relational analysis can deal with multi-attribute decision-making problems with "small samples" and "poor information", requires little computation, and is easy to combine with other methods, it has been widely used in research on decision-making methods under uncertain decision information. This article introduces the grey relational TOPSIS decision-making method based on the grey number dominance degree, which reduces the greyness of the original decision information and makes the process more convenient.
Problem Description:
Step 1: Normalize the matrix
In order to eliminate the influence of dimensions on the final result and make different variables comparable, the gray range transformation method is used to normalize the matrix X to obtain a dimensionless normalized decision matrix. Recorded as:
Step 2: Solve the decision vector of the positive and negative ideal plan
On the basis of the standardized decision matrix, the positive and negative ideal plan decision vector is solved through the comparison and sorting method based on gray number dominance. Suppose the decision vector of the positive ideal plan is y+, and the decision vector of the negative ideal plan is y-, respectively expressed as:
From this, the positive and negative ideal plan decision vectors y+ and y- of the decision matrix are obtained.
Further, according to the obtained positive and negative ideal plan decision vectors, the dominance of each plan's decision attribute values is calculated: the dominance of the sub-factors in the positive ideal plan decision vector relative to the decision value, and the dominance of the decision value relative to the sub-factors in the negative ideal plan decision vector, abbreviated as the positive dominance SPD+ and the negative dominance SPD-. The specific calculation formulas for the positive and negative dominance are as follows:
After calculating the positive dominance and negative dominance of all decision values, all the positive dominance forms the positive dominance matrix S+, and all the negative dominance forms the negative dominance matrix S-, expressed as
Then, the dominance of each sub-factor of the positive and negative ideal plan decision vectors relative to itself is needed. When a decision value is compared with itself, the parameters are identical, so the dominance of each sub-factor of the positive and negative ideal plan decision vectors relative to itself is 0, and these self-dominance values constitute zero vectors:
Step 3: Optimization of attribute weights
Suppose the subjective weight vector given by the decision maker in the actual decision-making problem is:
The objective weight vector obtained after optimization is:
Considering simultaneously the optimization goals of minimizing the sum of all positive dominance degrees and maximizing the sum of all negative dominance degrees, the objective weight vector W* is solved, and the optimization model M1 is:
Here, the exponent a indicates the importance attached to the positive dominance of the decision values and satisfies 0 ≤ a ≤ 1; the larger a is, the higher that importance. Then, in order to take the subjective weight information into account, the optimization model M2 is constructed with the goal of minimizing the deviation between the subjective and objective weights:
Combine the optimization models M1 and M2 to solve the comprehensive weight vector w of decision attributes, and construct a multi-objective optimization model M:
Step 4: Grey relational analysis based on dominance
According to the above calculation results, grey relational processing is performed on the positive and negative dominance matrices. When the positive ideal dominance vector R+ is taken as the system behavior sequence, the smaller the positive dominance of a decision value, the closer it is to the corresponding sub-factor of the positive ideal plan decision vector; that is, the closer it is to zero, the greater the relational degree, and vice versa. The same applies to the negative ideal dominance. From the dominance matrices S+ and S-, the closeness relational degrees with respect to the zero vectors R+ and R- are obtained, and the attribute weights are incorporated, giving:
The idea of TOPSIS is further used to aggregate the grey closeness relational degrees, yielding the comprehensive closeness relational degree of ai:
The comprehensive closeness relational degree can not only reflect the superiority of a decision plan but also reflect, from the perspectives of closeness and similarity, the relation between the decision plan and the positive and negative ideal plans. By comparing the comprehensive closeness relational degrees, the optimal plan is obtained.
English translation: Google Translate.
References:
[1] Niu Yufei. Research on Multi-Attribute Decision-Making Methods with Three-Parameter Interval Grey Number Information [D]. Henan Agricultural University, 2018.
This article is original work by LearningYard学苑 and represents only the author's personal views. If there is any infringement, please contact us for removal.