Hugo website deployment

Hi, I’m interested in using Hugo, but before I start, I want to know something.

If I understand correctly, we need to find a hosting computer on which we can install and run hugo server? And this is the server that compiles all the web pages in real time?

In other words, we cannot just take the HTML files and put them on any (free) static page host?

The result is ALL static and can be hosted on any static page host, like GitHub Pages (free for open projects), Amazon S3 or similar.

You must have a computer with Hugo (a single binary) to build those static files; the files can then be copied to the host of your choice.
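For example, a build-and-upload could look like this (the user@host and paths are placeholders):

$ hugo                                             # builds the site into public/
$ rsync -avz public/ you@yourhost:/var/www/site/   # copy to any static host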


Oh… that’s awesome!

Thanks for the explanations.

I’m also checking Hugo out… after completing the quick start I took the contents of the folder named “public”, which was created after I ran the hugo command, and copied them to my Google Drive for testing. But the theme was lost; it looks pretty ugly. What exactly is supposed to be copied to the hosting site?

I have a similar question regarding deployment to a GoDaddy website. Do I need to manually change the link http://localhost:1313 that Hugo generates in index.html? Sorry if this sounds stupid, but I am new to Hugo and coding and I cannot figure out this problem. My config.yaml file has the address of my website, but Hugo still generates the index.html file with links to localhost:1313.
I appreciate any answer.

To generate the files ready for deployment, you need to run Hugo without the server parameter. Otherwise it generates files made for local development.
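In other words (assuming baseURL is set correctly in your site config):

$ hugo server    # local preview; links point to http://localhost:1313/
$ hugo           # deployment build; links use the baseURL from your config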


I’m yet to go live - and this is my first post on this forum - but is the following strategy possible and easy?

A headless web-serving computer sited “under my stairs”,
with Ubuntu, SSH server, Hugo and Apache
Content edited by SSH
Hugo controlled by SSH
Website generated by Hugo “without the server parameter”
Apache pointing to the public folder (see the sketch below).
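For the Apache part, a minimal virtual host sketch could look like this (the file path, server name and directories are placeholders):

# /etc/apache2/sites-available/site.conf (path is an assumption)
<VirtualHost *:80>
    ServerName yoursite.org
    DocumentRoot /home/you/site/public
</VirtualHost>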

Or could life be even easier than this?

Whether you host under the stairs or let Amazon/GitHub/other host your files, the “edit by SSH” doesn’t sound optimal.

You will miss out on the preview/livereload from running a local Hugo server.

From then on there are lots of options: push by SSH to a Git repo on your Ubuntu server, with a post-commit hook that builds to the public folder…
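With a bare Git repo on the server, that hook would typically be post-receive rather than post-commit. A rough sketch, with all paths as placeholders:

#!/bin/sh
# hooks/post-receive in the bare repo on the server
GIT_WORK_TREE=/home/you/site git checkout -f
cd /home/you/site && hugo --destination=/var/www/public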

bjornerik - I’m not going to argue with someone who knows a thousand times better than me; and, quite frankly, I will need to do my homework so that I might understand the “post-commit hook” comment. I think the main implication of what you are saying is that ‘posts’ are going to give me extra work. In appreciation for your comment I can only give you my immediate (albeit noobish) reaction.

I’m not sure that I am missing out. If people are generating their site locally and copying it to a preferred host, it seems that such a deployment requires more operations than I perform. The live reload of hugo server is superb for redesign, experimentation and otherwise potentially “upsetting” changes. I use hugo server offline on my normal computer. But for a simple edit or new blog post, Hugo is so blazingly, amazingly fast at regenerating that an SSH edit to the little box under the stairs appears in the browser (via Apache) in a blink.

Also, more importantly for a noob like me, when hugo server is running, my links become “localhost” (see krakus66 above) and I haven’t yet found the way around this. I posted my situation here for comment because oscar_b seemed to imply that localhost links were inevitable. I think I must be missing something obvious!

I wrote this from my point of view, and everyone has their own workflow.

When I write content, in most cases the result will be better if I can see the rendered output before I publish. The faster, the better.

If this isn’t the case for you, then it cannot get much simpler than editing over SSH, then running hugo in the same SSH session when done.
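That is, something like this (the host alias is a placeholder):

$ ssh you@stairs-box
$ cd ~/site && hugo    # regenerate public/, which Apache serves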

Hi @gungfuzi,

hugo server runs an actual built-in web server, normally on port 1313, but you may set it to anything, even port 80. With that, you don’t even need to have Apache running; just run:

$ sudo hugo server --watch --baseUrl=http://yoursite.org/ --port=80 \
  --appendPort=false

Note: Run hugo server -h to see a description of all these options. sudo alert! You need to be root to serve on port 80.

Running hugo server this way, the links will point to “http://yoursite.org/” rather than to “http://localhost:1313/”. Yes, you must specify --baseUrl on the command line for this use case. You might want to add --disableLiveReload=true too, as otherwise all your web pages will have the LiveReload JavaScript snippet attached, not only for you, but also for all visitors.
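Putting those options together, the full command would look something like this:

$ sudo hugo server --watch --baseUrl=http://yoursite.org/ --port=80 \
  --appendPort=false --disableLiveReload=true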


I am not sure if you really want to use hugo server though, as you seem to be happy with the Apache setup. You may accomplish the same effect with Apache running on port 80 and serving the public/ directory that Hugo generates, while you also run hugo to watch and refresh the web files after each edit:

$ hugo --watch 

That’s all you need to do to have your website re-generated automatically every time you save a *.md content file in your text editor that you access through SSH, if that is your preferred workflow. :slight_smile:

Cheers,

Anthony


Thanks @anthonyfok, that’s me sorted!!

I’ve stopped Apache now and become a convert.

PS: apologies to other posters for not picking up on the “at-sign” mentions sooner.

Regarding building, running on a standard web server, and automatic deployment: I took a quick look at Hugo and have to say it massively impressed me with its portability and cool build workflows.

  • After running hugo locally in the project root, all you need on your web server is in the public/ directory

  • Further, I was able to implement Deployer (https://deployer.org/docs) within about 10 minutes.

Yeah :slight_smile:


I opted not to use a server for hosting my site, for a few reasons. Since Hugo generates static files, you’re able to host it for free on GitHub pages, as well as (for low traffic) on something like AWS S3. I also don’t want to have to worry about maintaining a server, its uptime, and security.

For my personal site, I have a private GitHub repository which contains my site content, and another public repository which contains the theme. When I make a new commit, AWS CodeBuild starts the build process and deploys the result to S3. This allows me to keep my project setup in Sublime Text on my local machine, and if I need to, I can clone the GitHub repository from any other machine and push edits from there.
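The CodeBuild side of that can be a small buildspec.yml; a rough sketch, where the bucket name and the availability of the hugo binary in the build image are assumptions:

version: 0.2
phases:
  build:
    commands:
      - hugo                                            # assumes hugo is on the build image
  post_build:
    commands:
      - aws s3 sync public/ s3://your-bucket --delete   # bucket name is a placeholder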

I personally find running Hugo commands and other deployment commands locally tedious. I wanted to remove as much friction from deployment as possible so I can focus on writing. Also, for security reasons I disable password authentication on any server I set up, so your SSH method also wouldn’t work for me in that regard.

But deploy in whatever way you’re comfortable with; I just wanted to share in case you weren’t aware of the server-less approach.

Something that occurred to me: because Hugo is a single binary, it facilitates various types of workflow, from local to remote server-based. Imagine doing that with Ruby or Perl. So painful.


I’ve recently started using Hugo too. I also keep my site source in a private repo on GitLab and my template in a public repo on GitHub.

But I’m using the incredible and excellent free hosting on Netlify for now. They are responsive to support questions, friendly, and provide an amazing service. Builds are automatic on push to GitLab, and Netlify tells me what’s going on via Slack. I can see all the build output, so I don’t get stuck on an error (like when using Jekyll with GitHub Pages). And they integrate Let’s Encrypt, making TLS totally trivial.
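For reference, the Netlify build settings can live in a netlify.toml at the repository root; a minimal sketch (the values are illustrative, not my exact setup):

[build]
  command = "hugo"
  publish = "public"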

Here is the theme I’m slowly working up: https://github.com/TotallyInformation/hugo-theme-twenty-sixteen (a blog and information management theme for Hugo, roughly based on the WordPress Twenty Sixteen theme)

And here is the site, now live: https://it.knightnet.org.uk - still some rough edges but certainly getting there.


I currently host my Hugo site on Azure and I have written a blog post here on how I do it.

Also, I have written another post about how I deploy my site to Azure using Jenkins, here.

If you have any questions, I would be happy to answer them :slight_smile:


I am new to Hugo and so far am loving it. I have designed my site, but my problem is the deployment, because I want to deploy to Google Firebase. When I tried the command hugo && firebase deploy, I got this message: (‘firebase’ is not recognized as an internal or external command, operable program or batch file.) Please, what do you think I could be doing wrong?

I’m curious why you switched from Cloudflare to a Netlify deployment. I saw your recent blog post on that on the front page, but would a Cloudflare deployment not be better for Hugo? After all, Netlify doesn’t offer DDoS protection and doesn’t have a WAF add-on. Just curious, since you’re into security and probably know this better than me. :slight_smile:

Well, you are correct about Cloudflare’s DDoS protection, though it is debatable how much of that kicks in with the free tier. However, Netlify is also a CDN, so it also has multiple endpoints, which should provide some protection.

Mainly though, I don’t see the need for another layer when Netlify has let me use Let’s Encrypt (replacing a previous self-signed cert + Cloudflare SSL) and lets me add all the security headers I probably should have added myself previously but never quite got round to :blush:

To be sure, Cloudflare provides more in the way of its WAF, but then you have to pay for much of it, and that really isn’t worth it for my meagre blog. Some of what their WAF does is achieved by security headers, which Netlify lets me set anyway. It took me some research, but that was worthwhile because it brought me up to speed with the latest thinking in web security; as an Information Security manager these days, more than a practitioner, it can be hard to stay current.

In case you are interested, here are some of the security headers I use:

# Header rules for Netlify

# Prevent embedding & provide XSS protection
# Also preload resources to speed up and unblock rendering
/*
  X-Frame-Options: DENY
  X-XSS-Protection: 1; mode=block
  X-Content-Type-Options: nosniff
  # Control passing of referrer data between pages/sites

  # `strict-origin-when-cross-origin` would be best but is pointless as not well supported
  # `origin` has best mix of security, analytics and support - so use origin as fallback.
  # https://w3c.github.io/webappsec-referrer-policy/#parse-referrer-policy-from-header
  # https://scotthelme.co.uk/a-new-security-header-referrer-policy/
  Referrer-Policy: origin, same-origin, strict-origin-when-cross-origin

  # Test @ https://report-uri.com/account/reports/csp/
  # Build @ https://report-uri.com/home/generate
  Content-Security-Policy: upgrade-insecure-requests; report-uri https://totallyinfo.report-uri.com/r/d/csp/enforce;

You should note that the CSP listed there is not my final one; that’s just for messing around.
