How to stack several MLPAutoencoder filters in MultiFilter

Marina Santini
Hi,

I am working with text classification and comparing several attribute selection methods.

I am trying to understand how I can benefit from the unsupervised MLPAutoencoder filter.
Its basic performance is terribly disappointing, because it can only learn auto-encoders with a single hidden layer, as stated in a previous thread on this list (see below):
Wed, 12 Sep 2018,
"Yes, there is the MLPAutoencoder in the multiLayerPerceptrons package. It can only learn auto-encoders with one hidden layer though. If you want to do something more complex, you will need to stack them using MultiFilter. Note that the auto-encoders will be optimised individually, not jointly, if you do that."

My question is: how can I stack several MLPAutoencoder filters in MultiFilter? Do I have to differentiate their settings?
Look at the picture below:

[image.png: screenshot of a MultiFilter configuration containing three MLPAutoencoder filters]



In the experiment shown in the picture, I stacked three MLPAutoencoders in MultiFilter. I only changed the seed number.
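
For reference, here is roughly what my configuration corresponds to in the Weka Java API (just a sketch: the file name is a placeholder, -N 2 is the filter's default, and I am assuming that -S sets the random seed, as in most Weka schemes):

import weka.core.Instances;
import weka.core.Utils;
import weka.core.converters.ConverterUtils.DataSource;
import weka.filters.Filter;
import weka.filters.MultiFilter;
import weka.filters.unsupervised.attribute.MLPAutoencoder;

public class StackedAutoencoders {
  public static void main(String[] args) throws Exception {
    Instances data = DataSource.read("corpus.arff"); // placeholder file name
    data.setClassIndex(data.numAttributes() - 1);

    // Three identically configured auto-encoders differing only in the seed.
    // MultiFilter applies them in sequence, so each one is trained on the
    // output of the previous one (individually, not jointly).
    MLPAutoencoder ae1 = new MLPAutoencoder();
    ae1.setOptions(Utils.splitOptions("-N 2 -S 1"));
    MLPAutoencoder ae2 = new MLPAutoencoder();
    ae2.setOptions(Utils.splitOptions("-N 2 -S 2"));
    MLPAutoencoder ae3 = new MLPAutoencoder();
    ae3.setOptions(Utils.splitOptions("-N 2 -S 3"));

    MultiFilter multi = new MultiFilter();
    multi.setFilters(new Filter[] { ae1, ae2, ae3 });
    multi.setInputFormat(data);

    Instances transformed = Filter.useFilter(data, multi);
    System.out.println(transformed.numAttributes() + " attributes after filtering");
  }
}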

Is this the right way to stack several MLPAutoencoders?

In fact, the performance of 3 stacked MLPAutoencoders is worse than the performance of a single MLPAutoencoder. Is that possible?

Thanks in advance for your answer. 

Cheers, Marina

 



Re: How to stack several MLPAutoencoder filters in MultiFilter

Eibe Frank
What you are doing looks fine in principle. Changing the seed is probably not necessary.

However, rather than moving straight to stacked auto-encoders, I would first try to optimise the parameter settings of a single auto-encoder. For example, the default is -N 2, which means the encoder only has two neurons in the hidden layer! Unless your data has a very simple structure, this won’t work. You should also experiment with the regularisation parameter (i.e., the parameter that controls overfitting). This is the “lambda” factor, given by -L. I would vary it on an exponential scale, i.e., try 0.00001, 0.0001, 0.001, etc.
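
For example, a lambda sweep over a single auto-encoder could look roughly like this (just a sketch: the hidden-layer size, base classifier, and file name are placeholders; wrapping the filter in FilteredClassifier ensures the auto-encoder is re-learned inside each cross-validation fold):

import java.util.Random;
import weka.classifiers.Evaluation;
import weka.classifiers.functions.SMO;
import weka.classifiers.meta.FilteredClassifier;
import weka.core.Instances;
import weka.core.Utils;
import weka.core.converters.ConverterUtils.DataSource;
import weka.filters.unsupervised.attribute.MLPAutoencoder;

public class LambdaSweep {
  public static void main(String[] args) throws Exception {
    Instances data = DataSource.read("corpus.arff"); // placeholder file name
    data.setClassIndex(data.numAttributes() - 1);

    for (double lambda : new double[] {1e-5, 1e-4, 1e-3, 1e-2, 1e-1}) {
      MLPAutoencoder ae = new MLPAutoencoder();
      // -N: hidden-layer size (the default of 2 is far too small for text data)
      ae.setOptions(Utils.splitOptions("-N 100 -L " + lambda));

      FilteredClassifier fc = new FilteredClassifier();
      fc.setFilter(ae);
      fc.setClassifier(new SMO()); // any base classifier; SMO is common for text

      // 10-fold cross-validation; the filter is rebuilt in every fold.
      Evaluation eval = new Evaluation(data);
      eval.crossValidateModel(fc, data, 10, new Random(1));
      System.out.printf("lambda=%g accuracy=%.2f%%%n", lambda, eval.pctCorrect());
    }
  }
}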

You might also want to try switching from an auto-encoder that uses weight decay to control overfitting (the default) to a contractive auto-encoder (via -C).
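
For example (again only a sketch, with placeholder values for -N and -L):

MLPAutoencoder ae = new MLPAutoencoder();
// -C switches from weight decay (the default) to the contractive variant
ae.setOptions(weka.core.Utils.splitOptions("-C -N 100 -L 0.001"));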

Cheers,
Eibe


Re: How to stack several MLPAutoencoder filters in MultiFilter

Marina Santini
Thanks a lot, Eibe!

Cheers, Marina 
