This repository has been archived by the owner on May 4, 2019. It is now read-only.
Consider a situation with 100 books of 2–3 MB each: that is already 200–300 MB of PDFs. If we keep several versions of each one, the content could exceed 1 GB, plus the database and other files.
If we optimize all of them, the backups will be much smaller.
I believe most of the size comes from the images, so it may be a better idea to optimize the images on the site...
In my case the original cover image is 0.7 MB and the originally exported PDF is 2.5 MB. If the optimized file is 0.5 MB, we can save about 2 MB, and that saving comes from the embedded image.
The same book without the picture is 0.5 MB, and 0.4 MB after optimization.
Description
To be able to produce a smaller exported file.
Expected behaviour
The export produces an optimized file.
Actual behaviour
The exported file is too big, and there is no automatic process for the optimization.
Steps to reproduce the problem
Export a file, then check its size (2.4 MB in my example); after it is created, it must be optimized manually (0.5 MB in my example).
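The manual optimization step above could be automated as a post-export hook. A minimal sketch using Ghostscript, assuming `gs` is installed on the server; the function names here are hypothetical illustrations, not part of this project:

```python
import shutil
import subprocess

def build_gs_command(src: str, dst: str, quality: str = "/ebook") -> list:
    """Build a Ghostscript command that rewrites a PDF with recompressed,
    downsampled images. /ebook targets roughly 150 dpi images; /screen
    is more aggressive and produces smaller but lower-quality files."""
    return [
        "gs",
        "-sDEVICE=pdfwrite",
        "-dCompatibilityLevel=1.4",
        "-dPDFSETTINGS=" + quality,
        "-dNOPAUSE",
        "-dBATCH",
        "-dQUIET",
        "-sOutputFile=" + dst,
        src,
    ]

def optimize_pdf(src: str, dst: str) -> bool:
    # Skip quietly when Ghostscript is not available on the host,
    # so the export itself never fails because of the optimizer.
    if shutil.which("gs") is None:
        return False
    subprocess.run(build_gs_command(src, dst), check=True)
    return True
```

Running this right after the PDF is written would turn the 2.4 MB example into something close to the manually optimized 0.5 MB, with no user action required.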
System Information