Azure App Gateway Certificate Invalid

If you run into an error when creating "HTTP Settings" while configuring an Azure Application Gateway, you may not have thought about which certificate (thumbprint) you are actually uploading to it.

If you look closely at the details of the error message, you will see the item applicationGateways/trustedRootCertificates: what needs to be uploaded there is a *.cer file with the root certificate, not the certificate of the domain name itself. This follows from the way the application gateway handles encrypted connections that are not terminated on it.

How do you obtain this certificate? If it is already installed on some Windows Server, open the MMC console, add the Certificates snap-in for the Local Computer, browse to Trusted Root Certification Authorities, find the one corresponding to your certificate (which you can find under Personal), and export it via right-click (under All Tasks) in the Base-64 encoded X.509 (.CER) format. The whole configuration, including screenshots, is covered in the Azure documentation.
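The MMC steps above can also be sketched in PowerShell. This is a rough sketch, not the official procedure; the subject filter and the output paths are assumptions, so adjust them to your certificate.

```powershell
# Find the leaf certificate in the Personal store (the CN filter is an assumption)
$leaf = Get-ChildItem Cert:\LocalMachine\My |
        Where-Object Subject -like '*CN=*.domain.com*' | Select-Object -First 1

# Find its issuer in Trusted Root Certification Authorities
$root = Get-ChildItem Cert:\LocalMachine\Root |
        Where-Object Subject -eq $leaf.Issuer | Select-Object -First 1

# Export-Certificate writes DER; certutil -encode converts it to Base-64 (.CER)
Export-Certificate -Cert $root -FilePath C:\temp\root.der | Out-Null
certutil -encode C:\temp\root.der C:\temp\root.cer
```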

It is still possible to use "Use Well Known CA Certificate" if it is a certificate from a real, well-known CA. That option obviously cannot be used for self-signed certificates.
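If you prefer the command line, the exported root certificate can also be uploaded with the Azure CLI. The resource names below are placeholders for your own gateway; the sketch assumes you are already logged in with az login.

```shell
az network application-gateway root-cert create \
  --resource-group MyResourceGroup \
  --gateway-name MyAppGateway \
  --name MyRootCert \
  --cert-file root.cer
```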


Jenkins error messages and solutions

Solutions to several errors, noted down in case I need to solve them again:

java.lang.IllegalArgumentException: Empty path not permitted.
 at org.eclipse.jgit.treewalk.filter.PathFilter.create(PathFilter.java:80)
 at org.eclipse.jgit.treewalk.TreeWalk.forPath(TreeWalk.java:205)
 at org.eclipse.jgit.treewalk.TreeWalk.forPath(TreeWalk.java:249)
 at org.eclipse.jgit.treewalk.TreeWalk.forPath(TreeWalk.java:281)
 at jenkins.plugins.git.GitSCMFile$3.invoke(GitSCMFile.java:165)
 at jenkins.plugins.git.GitSCMFile$3.invoke(GitSCMFile.java:159)
 at jenkins.plugins.git.GitSCMFileSystem$3.invoke(GitSCMFileSystem.java:193)
 at org.jenkinsci.plugins.gitclient.AbstractGitAPIImpl.withRepository(AbstractGitAPIImpl.java:29)
 at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.withRepository(CliGitAPIImpl.java:72)
 at jenkins.plugins.git.GitSCMFileSystem.invoke(GitSCMFileSystem.java:189)
 at jenkins.plugins.git.GitSCMFile.content(GitSCMFile.java:159)
 at jenkins.scm.api.SCMFile.contentAsString(SCMFile.java:338)
 at org.jenkinsci.plugins.workflow.cps.CpsScmFlowDefinition.create(CpsScmFlowDefinition.java:110)
 at org.jenkinsci.plugins.workflow.cps.CpsScmFlowDefinition.create(CpsScmFlowDefinition.java:67)
 at org.jenkinsci.plugins.workflow.job.WorkflowRun.run(WorkflowRun.java:292)
 at hudson.model.ResourceController.execute(ResourceController.java:97)
 at hudson.model.Executor.run(Executor.java:429)
Finished: FAILURE
The problem was a newly created job with the "Lightweight checkout" option checked. After unchecking it, the job worked again.
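The checkbox can also be flipped from the command line via the Jenkins REST API. This is a hedged sketch: JENKINS_URL, JOB and AUTH are placeholders for your instance, and the <lightweight> element is where that checkbox lives inside the job's config.xml.

```shell
# Placeholders for your Jenkins instance
JENKINS_URL="https://jenkins.example.com"
JOB="my-pipeline"
AUTH="user:apiToken"

toggle_lightweight() {
    # Fetch the job definition, flip the <lightweight> flag, and post it back
    curl -sf -u "$AUTH" "$JENKINS_URL/job/$JOB/config.xml" \
      | sed 's|<lightweight>true</lightweight>|<lightweight>false</lightweight>|' \
      | curl -sf -u "$AUTH" -X POST -H 'Content-Type: application/xml' \
            --data-binary @- "$JENKINS_URL/job/$JOB/config.xml"
}
```

Call toggle_lightweight once and then re-run the job.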

NAV Dynamics login error AAD

Error accessing Website Microsoft Dynamics NAV 2017 Web Client

Raw Url: /XYZ/WebClient/SignIn.aspx?ReturnUrl=%2fXYZ%2fWebClient%2f
Type: Microsoft.Dynamics.Nav.Types.NavSecurityNegotiationException
Message: The Service Principal Name (Delegation) configuration has been set incorrectly. Server connect URL: "net.tcp://localhost:7346/XYZ/Service". SPN Identity: "DynamicsNAV/localhost:7346"
  • The X.509 certificate CN=*.domain.com, O=Company Ltd, L=City, C=CZ is not in the trusted people store.
  • The X.509 certificate CN=*.domain.com, O=Company Ltd, L=City, C=CZ chain building failed. The certificate that was used has a trust chain that cannot be verified. Replace the certificate or change the certificateValidationMode. The revocation function was unable to check revocation because the revocation server was offline.
  • Restart the service (I restarted the whole machine)
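For the two certificate errors above, a hedged PowerShell sketch: import the certificate into the Trusted People store and let certutil explain the chain and revocation state. The file path is a placeholder.

```powershell
# Put the web client certificate into the Trusted People store (path is a placeholder)
Import-Certificate -FilePath C:\temp\wildcard.cer `
    -CertStoreLocation Cert:\LocalMachine\TrustedPeople

# certutil builds the chain and queries the CRL/OCSP endpoints; useful when
# the error says "the revocation server was offline"
certutil -verify -urlfetch C:\temp\wildcard.cer
```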

Go to IIS, open Application Pools, select the Microsoft Dynamics NAV 2017 Web Client application pool and open Advanced Settings. Find Process Model / Load User Profile and make sure it is set to False (the default is True). (source)
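The same setting can be scripted with appcmd; run it in an elevated prompt on the IIS server. The application pool name is an assumption, so check the exact name in IIS Manager first.

```powershell
# Set Load User Profile to false for the NAV web client app pool (name may differ)
& "$env:windir\system32\inetsrv\appcmd.exe" set apppool `
    /apppool.name:"Microsoft Dynamics NAV 2017 Web Client" `
    /processModel.loadUserProfile:false
```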

Other related errors and warnings:

This hint for solving "The X.509 certificate is not in the trusted people store", namely changing the certificate validation mode, did not help (and which config file is it supposed to be in, anyway?).

Background: NAV runs under a Microsoft Azure Active Directory service account with MFA deactivated ($Sta = @() and Set-MsolUser -UserPrincipalName $serviceAccountFullName -StrongAuthenticationRequirements $Sta -State "MFA disabled for this user"). An SPN is somehow not required for this scenario, but it is still mentioned in many places...

setspn -l domain\computerName
    some SPNs are registered
setspn -l domain\userAccount
    nothing registered
setspn -A was unsuccessful due to the error "Failed to assign SPN on account, error 0x2098/8344 -> Insufficient access rights to perform the operation", even though I had all rights (Global administrator).

SPN Identity: DynamicsNAV was created inside Azure as a Registered App
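That app registration can also be created with the Azure CLI instead of the portal. The display name matches the SPN identity above; the identifier URI is purely an illustrative assumption.

```shell
az ad app create \
  --display-name "DynamicsNAV" \
  --identifier-uris "https://mytenant.onmicrosoft.com/DynamicsNAV"
```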


Essay: Page count as academic assignment requirement

The current issue with academic texts, in my opinion, is the number of people producing this sort of text without having anything concrete to say. There are (too) many people for whom writing an academic text is a precondition for something else, such as obtaining a degree or starting or continuing research. On the other hand, there are undoubtedly also very good texts and ideas which simply get lost in the quantity of other documents, or which do not get enough attention due to the timing of publication or many other reasons. The connecting issue is when good ideas get lost in their own length. None of these situations benefits anyone, because they consume resources on the side of both producers and receivers.
Writing long texts has, in my opinion, a historical context (several hundred years ago), when writing was not done by many people (relative to the whole population) and was reserved for a rather small group, since it could be considered a "luxury" activity. It is also worth thinking about the reachability of texts before the information technology revolution (including typewriters and similar devices), when reproducing (academic, non-commercial, personal, etc.) texts was an arduous activity, or was done in larger volumes only after a (probably quite picky) selection. All of these aspects might be the reason why we think that the length of a text is a good metric for recognizing a good text.
Another, more concrete reason for defining text length as a metric for writing assignments might be the predictable amount of time spent on writing. We cannot say how long another person spent thinking about a topic, but we can say that writing one page takes some specific amount of time, and multiply it by the required criteria. Based on my observations, that happens especially in the academic environment, where no other transparent specification of workload exists. A page-length criterion can also be explained as the most comfortable option for the assigning authority, and partly as a way for the assignee to understand the scope of the task. Are these concerns still relevant nowadays?
There is an ongoing fight for human attention coming from many directions, primarily from the entertainment and commercial sectors, represented by ads and various news feeds with personalized content, which is usually a more attractive, easily consumed and partly addictive activity to watch or read. It is designed to be attractive, which brings popularity to the platforms where it is published. Compared to that, we have writing methods defined by completely opposite parameters: based on complexity, exhaustive correctness, and measurable effort spent. To be clear, I have nothing against scientific texts without simplifications for the masses; the whole point is the text-length criterion attached to them.
A well-known writer said, through his literary persona, "A designer knows he has achieved perfection not when there is nothing left to add, but when there is nothing left to take away." (Antoine de Saint-Exupéry, 1943), and as a critical reader of scholarly literature I can sign that immediately, emphasizing that it is visible whenever a text is not written according to that quote. This approach should be widely adopted by academic writers and demanded by readers of scholarly literature, so that everyone stays focused and does not waste time merely fulfilling the criteria of the assignment. There are so many interesting articles and books to read, so why waste time writing and reading empty sentences?


A simple proxy server with iptables

I needed to create a super simple, single-purpose proxy server because of a missing firewall opening and, of course, an urgent request to get things working outside business hours.

A minimal-sized virtual machine with Ubuntu was used for it. The iptables tool served as the proxy, but it could also be done with an nginx or apache2 proxy.

An initial check that iptables is empty can be done with the command
sudo iptables -t nat -L
where no rules should be listed.

The easiest way is to create a script with the transparent proxy rules:
nano natscript.sh

with the following content:
echo 1 > /proc/sys/net/ipv4/ip_forward
iptables -F
iptables -t nat -F
iptables -X
iptables -t nat -A PREROUTING -p tcp --dport 8080 -j DNAT --to-destination <target-ip>
iptables -t nat -A POSTROUTING -p tcp -d <target-ip> --dport 8080 -j SNAT --to-source <proxy-ip>

where <target-ip> is the destination address we actually want to connect to, and <proxy-ip> is the address of the host server, so that the traffic gets back to the sender of the request. No port translation takes place; the port stays at 8080.

Let's go ahead and run the script, which sets up the rules:
sudo chmod +x natscript.sh
sudo ./natscript.sh

After running it, the rules will be visible in the output of the iptables-save command:

# Generated by iptables-save v1.6.1
-A PREROUTING -p tcp -m tcp --dport 8080 -j DNAT --to-destination <target-ip>
-A POSTROUTING -d <target-ip>/32 -p tcp -m tcp --dport 8080 -j SNAT --to-source <proxy-ip>
# Completed

Now this configuration needs to be saved permanently
sudo su root
sudo apt-get install iptables-persistent

or, after a later change, re-save it with
sudo dpkg-reconfigure iptables-persistent

However, this configuration still will not survive a server restart; you also need to edit the file
sudo nano /etc/sysctl.conf
uncomment the line
net.ipv4.ip_forward=1
and save.

Instead of pointing requests at <target-ip>, you now query <proxy-ip> and get the same result.

If you want to debug the rules during the translation, I recommend the tcpdump command
sudo tcpdump dst port 8080 or src port 8080

And that is the creation of a simple proxy server that works around a missing ACL rule on the firewall.