
Sequential Backward Selection

Backward Selection starts from the whole feature set and arrives at the final attribute set by repeatedly removing, at each step, the feature whose removal leaves the objective function as large as possible (i.e., the feature whose removal hurts the objective the least).

Sequential Backward Selection Algorithm

  1. Start with the full set: let Y = X.
  2. Find the feature x in Y whose removal maximizes the objective, i.e., for which F(Y − {x}) is largest.
  3. Remove it: Y = Y − {x}, and repeat step 2.

If we run the SBS algorithm to completion we end up with Y = ∅; to avoid this, a stopping criterion is imposed in practice (see the sketch below).
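Here is a minimal Python sketch of the procedure, assuming F is a scoring function that accepts a subset of features and that we stop once Y shrinks to a chosen target size (both the function name and the stopping rule are illustrative assumptions, not part of the original notes):

```python
def sbs(X, F, target_size):
    """Sequential Backward Selection (minimal sketch).

    X           -- the full feature set, e.g. a set of feature names/indices
    F           -- objective function mapping a feature subset to a score
    target_size -- stopping criterion: stop once |Y| reaches this size
    """
    Y = set(X)                                    # step 1: Y = X
    while len(Y) > target_size:                   # stopping criterion
        # step 2: the feature whose removal leaves F as large as possible
        worst = max(Y, key=lambda x: F(Y - {x}))
        Y = Y - {worst}                           # step 3: remove it and repeat
    return Y
```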

Example:

Apply backward feature selection to the following objective function, without a stopping criterion.
F(x1, x2, x3, x4) = 3*x1*x2 - x3 + 4*x4
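One convenient way to encode this objective in code is to treat each xi as a 0/1 indicator of whether feature i is still in the selected set. A small Python version of the example function follows (the set-of-indices representation is my own assumption):

```python
def F(selected):
    """Example objective F = 3*x1*x2 - x3 + 4*x4, where xi = 1 if
    feature i is in `selected` (a set of indices 1..4), else 0."""
    x1, x2, x3, x4 = (1 if i in selected else 0 for i in (1, 2, 3, 4))
    return 3 * x1 * x2 - x3 + 4 * x4
```

With this encoding, F({1, 2, 3, 4}) evaluates to 6 and F({1, 2, 4}) (i.e., x3 = 0) evaluates to 7, matching the first step of the solution below.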


Solution:

F(x1, x2, x3, x4) = 3*x1*x2 - x3 + 4*x4

  1. Check the Objective function value after removing each of x1, x2, x3, and x4 in turn.

If x1=0, we have F(0,1,1,1)=3
If x2=0, we have F(1,0,1,1)=3
If x3=0, we have F(1,1,0,1)=7
If x4=0, we have F(1,1,1,0)=2

Since setting x3 = 0 leaves the highest value of the objective function (7), we remove x3.

2. Check the Objective function value for Y-{x3}
If x1=0, we have F(0,1,0,1)=4
If x2=0, we have F(1,0,0,1)=4
If x4=0, we have F(1,1,0,0)=3

Since setting x1 = 0 and setting x2 = 0 produce the same value, we can pick either x1 or x2; I will pick x1 for simplicity.

3. Check the Objective function value for Y-{x3,x1}
If x2=0, we have F(0,0,0,1)=4
If x4=0, we have F(0,1,0,0)=0
Since setting x2 = 0 produces the highest value of the objective function (4), we remove x2 in this step.

4. Check the Objective function value for Y-{x3,x1,x2}
If x4=0, we have F(0,0,0,0)=0
After this step the whole set has been removed, so Y = ∅.
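As a check, the small loop below reuses the F defined above and runs the same selection with no stopping criterion, printing each removal. The expected trace follows the arithmetic above; the tie between x1 and x2 is broken arbitrarily.

```python
Y = {1, 2, 3, 4}                 # start from the whole set
while Y:                         # no stopping criterion: run until Y is empty
    worst = max(Y, key=lambda x: F(Y - {x}))
    Y = Y - {worst}              # remove the chosen feature
    print(f"remove x{worst}: remaining score F = {F(Y)}")

# Expected output (tie between x1 and x2 broken arbitrarily):
#   remove x3: remaining score F = 7
#   remove x1: remaining score F = 4
#   remove x2: remaining score F = 4
#   remove x4: remaining score F = 0
```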

Summary:

Sequential Forward Selection is a smart choice when the desired cardinality of Y is small, while Backward Selection is preferred when the desired cardinality is large.

Neither SFS nor SBS can revisit a decision made at an earlier stage, so a feature that has been removed (or added) is never reconsidered. More sophisticated approaches are needed to overcome this limitation.

Thanks to Douglas Rumbaugh's Data Mining Class notes!

Happy studying! 😳
