Prolog is intended primarily as a declarative programming language: program logic is expressed in terms of relations, represented as facts and rules, and a computation is initiated by running a query over these relations. Prolog is well suited to tasks that benefit from rule-based logical queries, such as searching databases, voice-control systems, and filling templates. In this example we will run a Hello World Prolog script on Bacalhau.
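To make "relations, facts and rules" concrete, here is a minimal sketch of a hypothetical family database; the predicate names are illustrative and are not part of this tutorial's files:

```prolog
% Facts: parent(Parent, Child).
parent(tom, bob).
parent(bob, ann).

% Rule: X is a grandparent of Z if X is a parent of some Y
% who is in turn a parent of Z.
grandparent(X, Z) :- parent(X, Y), parent(Y, Z).
```

A query such as `?- grandparent(tom, Z).` would then succeed with `Z = ann`.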
Install SWI-Prolog
%%bash
sudo add-apt-repository ppa:swi-prolog/stable
sudo apt-get update
sudo apt-get install swi-prolog
Comprehensive Prolog implementation with extensive libraries and development tools. Official PPAs for SWI-Prolog. See https://www.swi-prolog.org for further information. More info: https://launchpad.net/~swi-prolog/+archive/ubuntu/stable
Reading package lists... Building dependency tree... Reading state information...
swi-prolog is already the newest version (8.4.3-0-bionicppa2).
0 upgraded, 0 newly installed, 0 to remove and 5 not upgraded.
Create a file called helloworld.pl
The following script prints 'Hello World' to stdout:
%%writefile helloworld.pl
hello_world :-
    write('Hello World'), nl,
    halt.
Overwriting helloworld.pl
Running the script
%%bash
swipl -q -s helloworld.pl -g hello_world
Hello World
After the script has run successfully locally, we can run it on Bacalhau.
Because the script only exists locally, we first upload it to IPFS with ipfs add helloworld.pl.
Using the IPFS CLI
!wget https://dist.ipfs.io/go-ipfs/v0.4.2/go-ipfs_v0.4.2_linux-amd64.tar.gz
!tar xvfz go-ipfs_v0.4.2_linux-amd64.tar.gz
!mv go-ipfs/ipfs /usr/local/bin/ipfs
!ipfs init
!ipfs cat /ipfs/QmYwAPJzv5CZsnA625s3Xf2nemtYgPpHdWEz79ojWnPbdG/readme
!ipfs config Addresses.Gateway /ip4/127.0.0.1/tcp/8082
!ipfs config Addresses.API /ip4/127.0.0.1/tcp/5002
!nohup ipfs daemon > startup.log &
--2022-11-12 08:44:36-- https://dist.ipfs.io/go-ipfs/v0.4.2/go-ipfs_v0.4.2_linux-amd64.tar.gz ... ‘go-ipfs_v0.4.2_linux-amd64.tar.gz.4’ saved [7642422/7642422]
Hello and Welcome to IPFS! If you're seeing this, you have successfully installed IPFS and are now interfacing with the ipfs merkledag!
nohup: redirecting stderr to stdout
!ipfs add helloworld.pl
added QmYq9ipYf3vsj7iLv5C67BXZcpLHxZbvFAJbtj7aKN5qii helloworld.pl
Copy the CID of the file, which in this case is QmYq9ipYf3vsj7iLv5C67BXZcpLHxZbvFAJbtj7aKN5qii.
Data uploaded to IPFS is not pinned by default and will eventually be garbage collected, so it needs to be pinned.
Pinning is the mechanism that tells IPFS to always keep a given object somewhere; by default that is your local node, though it can be elsewhere if you use a third-party remote pinning service.
Using NFTUp
NFTUp lets you upload files and directories to IPFS: just drag and drop your file or directory and it will be uploaded.
Running on Bacalhau
We will use the official swipl Docker image.
!curl -sL https://get.bacalhau.org/install.sh | bash
Your system is linux_amd64 No BACALHAU detected. Installing fresh BACALHAU CLI... Getting the latest BACALHAU CLI... Installing v0.3.11 BACALHAU CLI... Downloading https://github.com/filecoin-project/bacalhau/releases/download/v0.3.11/bacalhau_v0.3.11_linux_amd64.tar.gz ... Downloading sig file https://github.com/filecoin-project/bacalhau/releases/download/v0.3.11/bacalhau_v0.3.11_linux_amd64.tar.gz.signature.sha256 ... Verified OK Extracting tarball ... NOT verifying Bin bacalhau installed into /usr/local/bin successfully. Client Version: v0.3.11 Server Version: v0.3.11
Command
%%bash --out job_id
bacalhau docker run \
-v QmYq9ipYf3vsj7iLv5C67BXZcpLHxZbvFAJbtj7aKN5qii:/helloworld.pl \
--wait \
--id-only \
swipl \
-- swipl -q -s helloworld.pl -g hello_world
We mount the script into the container using the -v flag:
-v <CID>:/<name-of-the-script>
swipl flags
-q run in quiet mode
-s load a file as a script; here the script is helloworld.pl
-g the goal (predicate) to execute on startup; here it is hello_world
%env JOB_ID={job_id}
env: JOB_ID=ddd22db1-61c7-4302-bbe2-f38f8ffa2f44
%%bash
bacalhau list --id-filter ${JOB_ID} --wide
CREATED            ID                                    JOB                                                     STATE      VERIFIED  PUBLISHED
22-11-12-08:00:51  ddd22db1-61c7-4302-bbe2-f38f8ffa2f44  Docker swipl swipl -q -s helloworld.pl -g hello_world  Completed            /ipfs/QmYnaUZLWmbRTJzpx6kgxoAVT3ZAmhqWY6qWZm33v8PjNm
Where it says "Completed", the job is done and we can fetch the results.
To find out more information about your job, run the following command:
%%bash
bacalhau describe ${JOB_ID}
%%bash
rm -rf results && mkdir -p results
bacalhau get $JOB_ID --output-dir results
Fetching results of job 'ddd22db1-61c7-4302-bbe2-f38f8ffa2f44'... Results for job 'ddd22db1-61c7-4302-bbe2-f38f8ffa2f44' have been written to... results
Viewing the outputs
%%bash
cat results/combined_results/stdout
Hello World