Industry news

New - NetScaler Gateway (Maintenance Phase) Plug-ins and Clients for Build 12.0-59.8

Netscaler Gateway downloads - Wed, 09/19/2018 - 21:00
New downloads are available for Citrix Gateway
Categories: Citrix, Commercial, Downloads

Who Said On-Premises Email Was Dead, Look Out Exchange Server 2019 is Here!

Theresa Miller - Tue, 09/18/2018 - 11:23

Well, if you haven’t heard, Exchange Server 2019 is now in public preview. During Microsoft Ignite 2017 it was announced that Exchange Server 2019 would be coming out in 2018. This announcement put to rest fears that Exchange Server 2016 would be the last on-premises version. Microsoft came through and released the public preview of Exchange […]

The post Who Said On-Premises Email Was Dead, Look Out Exchange Server 2019 is Here! appeared first on 24x7ITConnection.

New - Latest EPA Libraries

Netscaler Gateway downloads - Fri, 09/14/2018 - 18:30
New downloads are available for Citrix Gateway
Categories: Citrix, Commercial, Downloads

PackMan in practice

The Iconbar - Fri, 09/14/2018 - 08:00
For this first article looking at how to create PackMan/RiscPkg packages, I've decided to use my SunEd program as a guinea pig. Being a simple C application with no dependencies on other packages, it'll be one of the most straightforward things on my site to get working, and one of the easiest for other people to understand.

Read on to discover how to turn simple apps like SunEd into RiscPkg packages, and more importantly, how to automate the process.

Building your first package, the PackIt way

The RiscPkg policy manual is a rather dry document, so the easiest way of getting your first package built is to use the PackIt tool created by Alan Buckley. After loading PackIt, you can simply drag an application to its iconbar icon and up will pop a window to allow you to enter all the extra details that RiscPkg needs to know.

PackIt's package creation wizard

Once everything is filled in correctly, opening the menu and selecting the "Save" option should allow you to save out the resulting package zip file.

PackIt's output

... except that the current version of PackIt seems to save it out with the wrong filetype. No problem, just manually set the type to 'zip' or 'ddc' and things look a lot better:

The actual package content

Pretty simple, isn't it? The !SunEd app has been placed inside an Apps.File directory (mirroring the default install location for the application on the user's hard disc), while the information that was entered into PackIt's wizard has been saved to the RiscPkg.Control and RiscPkg.Copyright files.
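
Incidentally, if you'd rather fix the filetype from the command line than via the Filer, something like this should do the trick (assuming the package was saved under the name 'package'; SetType accepts a filetype name or a hex number):

*SetType package DDC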

The Control and Copyright files

Control is a simple text file containing the package metadata (the structure of which is the subject of much of the RiscPkg policy document), while Copyright is a verbatim copy of the copyright message you entered into PackIt's window.
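
As an illustration, a minimal Control file for a package like SunEd might look roughly like this. The field values here are invented for the example; the RiscPkg policy manual remains the authoritative reference for the available fields:

Package: SunEd
Version: 2.33-1
Section: Text
Priority: Optional
Licence: Free
Maintainer: A Maintainer <packages@example.com>
Description: Editor for SunBurst save game files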

PackIt's Copyright tab

Now that you have a package built, you can easily test it out by dragging it to PackMan's iconbar icon. PackMan will then go through the usual installation procedure, just as if it was a package you'd selected to install from the Internet.

Loading the package in PackMan

Automating package building

Filling in PackIt's wizard the first time you create a package for an app is all well and good, but what about when you want to release an update for the package? Entering all the information over again wastes your time and introduces the risk of mistakes.

Most C/C++ developers are already familiar with using makefiles to build their programs. With a bit of effort, it's possible to create makefiles which can also automate creation of the corresponding RiscPkg package.

Before

After a brief bit of preparation, the 2003-vintage SunEd sources were tidied up and a simple makefile was written, allowing the application binary to be easily rebuilt on command.

The original SunEd source tree

CFLAGS = -Wall -mpoke-function-name -O2 -mlibscl -mthrowback -static

CC = gcc -c $(CFLAGS) -MMD
LINK = gcc $(CFLAGS)

SRCS = \
	suned \
	limp

OBJS = $(addsuffix .o, $(SRCS))

# Output file
!SunEd/!RunImage: $(OBJS)
	$(LINK) -o $@ $^ -mlibscl

# Object files
%.o: %.c
	$(CC) -MF d/$(basename $@) -o $@ $<

# Dependencies
-include d/*

The original SunEd makefile

As a brief overview:

  • c and h contain the source code as you would expect
  • d and o are used for intermediate files: autogenerated dependencies and object files
  • !SunEd is the full app, ready for distribution, and the makefile is only used to rebuild the !RunImage
And after

Rather than bore you with all the intermediate versions, I figured it was best to just jump straight to the final version of the makefile and the adjusted source structure.

The new SunEd source tree

CFLAGS = -Wall -mpoke-function-name -O2 -mlibscl -mthrowback -static

CC = gcc -c $(CFLAGS) -MMD
LINK = gcc $(CFLAGS)

CP = copy
CPOPT = A~CF~NQR~S~T~V

SRCS = \
	suned \
	limp

APP = Apps/File/!SunEd

ROAPP = $(subst /,.,$(APP))
OBJS = $(addprefix build/,$(addsuffix .o, $(SRCS)))

# Output file
build/!RunImage: $(OBJS)
	$(LINK) -o $@ $^ -mlibscl

# Object files
build/%.o: src/%.c build/dirs
	$(CC) -MF build/d/$(subst /,.,$(basename $@)) -o $@ $<

# Pattern rule for injecting version numbers into files
build/%.sed: src/template/% src/Version build/dirs
	sed -f src/Version $< > $@

# Explicit dependency needed for generated file build/VersionNum.sed
build/suned.o: build/VersionNum.sed

# Standard clean rule
clean:
	remove binary/zip
	remove source/zip
	x wipe build ~CFR~V

# Binary RiscPkg archive
binary.zip: build/pkg-dir
	remove binary/zip
	dir build.pkg
	zip -rqI9 ^.^.binary/zip *
	dir ^.^

# Source zip archive
source.zip: build/src-mani makefile COPYING
	remove source/zip
	zip -rqI9 source/zip src makefile COPYING

all: binary.zip source.zip

build/dirs:
	cdir build
	cdir build.o
	cdir build.d
	create build.dirs

# Double-colon rules execute in the order they're listed. So placing this rule
# here makes sure that the 'build' folder exists prior to the rule below being
# executed.
build/pkg-mani:: build/dirs

# Double-colon rules with no pre-requisites always execute. This allows us to
# make sure that build/pkg-mani is always up-to-date
build/pkg-mani::
	src/manigen src.pkg build.pkg-mani

# Same system as build/pkg-mani
build/src-mani:: build/dirs
build/src-mani::
	src/manigen src build.src-mani

# Create the package dir ready for zipping
build/pkg-dir: build/pkg-mani build/!RunImage build/Control.sed build/!Help.sed COPYING
# Copy over the static files
	x wipe build.pkg ~CFR~V
	$(CP) src.pkg build.pkg $(CPOPT)
# Populate the RiscPkg folder
	cdir build.pkg.RiscPkg
	$(CP) build.Control/sed build.pkg.RiscPkg.Control $(CPOPT)
	$(CP) COPYING build.pkg.RiscPkg.Copyright $(CPOPT)
# Populate the app folder
	$(CP) build.!Help/sed build.pkg.$(ROAPP).!Help $(CPOPT)
	$(CP) build.!RunImage build.pkg.$(ROAPP).!RunImage $(CPOPT)
# Create the dummy file we use to mark the rule as completed
	create build.pkg-dir

# Dependencies
-include build/d/*

The new SunEd makefile

As you can see, there have been a fair number of changes. Not all of them are strictly necessary for automating package creation (after all, a package is little more than a zip file), but this structure has resulted in a setup that helps to minimise the amount of work I'll need to do when preparing new releases. The setup should also be easily transferable to the other software I'll be wanting to package.

What it does

  • The clean rule reduces things to the state you see above
  • The source.zip rule builds a source archive, containing exactly what you see above
  • The binary.zip rule builds the RiscPkg archive, performing the following operations to get there:
    • A copy of the src.pkg folder is made, in order to provide the initial content of the package zip - essentially, the static files which aren't modified/generated by the build.
    • As you'd expect, the !RunImage file gets built and inserted into the app. But that's not all!
    • The src.Version file is actually a sed script containing the package version number and date:

      s/__UPSTREAM_VERSION__/2.33/g
      s/__PACKAGE_VERSION__/1/g
      s/__RISCOSDATE__/28-Aug-18/g

      The src.Version file

      This sed script is applied to src.template.!Help to generate the help file that's included in the package, src.template.Control to generate the RiscPkg.Control file, and src.template.VersionNum. By driving all the version number / date references off this one file, there won't be any embarrassing situations where a built program displays one version number in one place and a different one in another.

    • src.template.VersionNum is a C header file, which is used to inject the app version and date into !RunImage (there's a sketch of what it might contain just after this list).
    • The COPYING file in the root is used as the RiscPkg.Copyright file in the package.
  • All the intermediate files will be stored in a build folder, which helps keep the clean and source.zip rules simple.
  • Full dependency tracking is used for both the source.zip and binary.zip targets - adding, removing, or changing any of the files in src.pkg (or anywhere else, for source.zip) will correctly cause the target to be rebuilt. This is achieved without introducing any situations where the targets are redundantly built - so a build system which tries to build tens or hundreds of packages won't be slowed down.
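
As an illustration of the template mechanism, src.template.VersionNum might contain something along these lines. These are hypothetical contents and macro names, since the file itself isn't shown here:

/* Hypothetical src.template.VersionNum: the build/%.sed pattern rule runs
   sed over this, and the result is included when compiling suned.c */
#define SUNED_VERSION "__UPSTREAM_VERSION__"
#define SUNED_DATE    "__RISCOSDATE__"

After the sed pass, SUNED_VERSION expands to "2.33" and SUNED_DATE to "28-Aug-18", so the binary always agrees with the package metadata.
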
manigen

There are also a few extra files. The src.notes folder is a collection of notes from my reverse-engineering of the SunBurst save game format, which I've decided to include in the source archive just in case someone finds it useful. But that's not really relevant to this article.

manigen, on the other hand, is relevant. It's a fairly short and straightforward BASIC program, but it plugs a very large hole in make's capabilities: Make can only detect when files change, not directories. If you have a directory, and you want a rule to be executed whenever the contents of that directory changes, you're out of luck. For small projects like SunEd this isn't so bad, but for bigger projects it can be annoying, especially when all you really want to do with the files is archive them in a zip file.

Thus, manigen ("manifest generator") was born. All it does is recursively enumerate the contents of a directory, writing the filenames and metadata (length, load/exec addr, attributes) of all files to a single text file. However, it also compares the new output against the old output, only writing to the file if a change has been detected.

out%=0
ON ERROR PROCerror

REM Parse command line args
SYS "OS_GetEnv" TO args$
REM First 3 options will (hopefully) be 'BASIC --quit ""'
opt$ = FNgetopt : opt$ = FNgetopt : opt$=FNgetopt
REM Now the actual args
dir$ = FNgetopt
out$ = FNgetopt

DIM result% 1024

out%=OPENUP(out$)
IF out%=0 THEN out%=OPENOUT(out$)
mod%=FALSE

PROCprocess(dir$)
IF EOF#out%=FALSE THEN mod%=TRUE
IF mod% THEN EXT#out%=PTR#out%
CLOSE#out%
REM Oddity: Truncating a file doesn't modify timestamp
IF mod% THEN SYS "OS_File",9,out$
END

DEF PROCprocess(dir$)
LOCAL item%
item%=0
WHILE item%<>-1
SYS "OS_GBPB",10,dir$,result%,1,item%,1024,0 TO ,,,read%,item%
IF read%>0 THEN
n%=20
name$=dir$+"."
WHILE result%?n%<>0
name$=name$+CHR$(result%?n%)
n%+=1
ENDWHILE
PROCwrite(name$+" "+STR$~(result%!0)+" "+STR$~(result%!4)+" "+STR$~(result%!8)+" "+STR$~(result%!12))
IF result%!16=2 THEN PROCprocess(name$)
ENDIF
ENDWHILE
ENDPROC

DEF FNgetopt
LOCAL opt$
opt$=""
WHILE ASC(args$)>32
opt$ = opt$+LEFT$(args$,1)
args$ = MID$(args$,2)
ENDWHILE
WHILE ASC(args$)=32
args$ = MID$(args$,2)
ENDWHILE
=opt$

DEF PROCerror
PRINT REPORT$;" at ";ERL
IF out%<>0 THEN CLOSE#out%
END

DEF PROCwrite(a$)
LOCAL b$,off%
IF EOF#out% THEN mod%=TRUE
IF mod%=FALSE THEN
off%=PTR#out%
b$=GET$#out%
IF a$<>b$ THEN mod%=TRUE : PTR#out%=off%
ENDIF
IF mod% THEN BPUT#out%,a$
ENDPROC

manigen

On Unix-like OSes this is the kind of thing you could knock together quite easily using standard commands like find, ls, and diff. But the built-in *Commands on RISC OS aren't really up to that level of complexity (or at least not without the result looking like a jumbled mess), so it's a lot more sensible to go with a short BASIC program instead.
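
For the curious, a rough Unix equivalent might look like the sketch below (assuming GNU tools; 'manifest' is an invented file name, and ls -lR stands in for manigen's recursive enumeration):

# Regenerate the manifest, but only touch the real file when something
# has changed, so that make doesn't see a new timestamp on every run
ls -lR dir > manifest.new
if diff -q manifest.new manifest >/dev/null 2>&1; then
    rm manifest.new
else
    mv manifest.new manifest
fi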

The usage of manigen in the makefile is described in more detail below.

Makefile magic

Looking at each section of the makefile in detail:

Pattern rules

# Object files
build/%.o: src/%.c build/dirs
	$(CC) -MF build/d/$(subst /,.,$(basename $@)) -o $@ $<

The pattern rule used for invoking the C compiler has changed. Output files are placed in the build directory, and input files come from the src directory. The substitution rule is used to remove the directory separators from the filename that's used for the dependency files, so that they'll all be placed directly in build.d. For example, when $@ is build/suned.o, $(basename $@) gives build/suned, and the subst turns that into build.suned, so the dependency file ends up at build/d/build.suned. If they were allowed to be placed in subdirectories of build.d, we'd have to create those subdirectories manually, which would be a hassle.

# Pattern rule for injecting version numbers into files
build/%.sed: src/template/% src/Version build/dirs
	sed -f src/Version $< > $@

Another pattern rule is used to automate injection of the package version number and date into files: any file X placed in src.template can have its processed version available as build.X/sed (or build/X.sed as a Unix path). The sed extension is just a convenient way of making sure the rule acts on the right files.

build/dirs

Both of the above rules are also configured to depend on the build/dirs rule - which is used to make sure the build directory (and critical subdirectories) exist prior to any attempt to place files in there:

build/dirs:
	cdir build
	cdir build.o
	cdir build.d
	create build.dirs

The file build.dirs is just a dummy file which is used to mark that the rule has been executed.

Explicit dependencies

# Explicit dependency needed for generated file build/VersionNum.sed
build/suned.o: build/VersionNum.sed

Although most C dependencies are handled automatically via the -MF compiler flag (and the -include makefile directive), some extra help is needed for build.VersionNum/sed, because the file won't exist the first time the compiler tries to access it. By adding it as an explicit dependency, we can make sure it gets generated in time (although it does require some discipline on our part to make sure we keep track of which files reference build.VersionNum/sed).

Double-colon rules

# Double-colon rules execute in the order they're listed. So placing this rule
# here makes sure that the 'build' folder exists prior to the rule below being
# executed.
build/pkg-mani:: build/dirs

# Double-colon rules with no pre-requisites always execute. This allows us to
# make sure that build/pkg-mani is always up-to-date
build/pkg-mani::
	src/manigen src.pkg build.pkg-mani

The manigen program solves the problem of telling make when the contents of a directory have changed, but it leaves us with another problem: we need to make sure manigen is invoked whenever the folder we're monitoring appears in a build rule. The solution for this is double-colon rules, because they have three very useful properties, which are exploited above:

  1. A double-colon rule with no pre-requisites will always execute (whenever it appears in the dependency chain for the current build target(s)). This is the key property which allows us to make sure that manigen is able to do its job.
  2. You can define multiple double-colon rules for the same target.
  3. Double-colon rules are executed in the order they're listed in the makefile. So by having a rule which depends on build/dirs, followed by the rule that depends on nothing, we can make sure that the build/dirs rule is allowed to create the build folder prior to manigen in the second rule writing its manifest into it.
Of course, we could have just used one build/pkg-mani rule which manually creates the build folder every time it's executed. But the two-rule version is less hacky, and that's kind of the point of this exercise.

Creating the package directory

This is a fairly lengthy rule which does a few different things, but they're all pretty simple.

# Create the package dir ready for zipping
build/pkg-dir: build/pkg-mani build/!RunImage build/Control.sed build/!Help.sed COPYING
# Copy over the static files
	x wipe build.pkg ~CFR~V
	$(CP) src.pkg build.pkg $(CPOPT)
# Populate the RiscPkg folder
	cdir build.pkg.RiscPkg
	$(CP) build.Control/sed build.pkg.RiscPkg.Control $(CPOPT)
	$(CP) COPYING build.pkg.RiscPkg.Copyright $(CPOPT)
# Populate the app folder
	$(CP) build.!Help/sed build.pkg.$(ROAPP).!Help $(CPOPT)
	$(CP) build.!RunImage build.pkg.$(ROAPP).!RunImage $(CPOPT)
# Create the dummy file we use to mark the rule as completed
	create build.pkg-dir

Since there are many situations in which the copy command's default options will stop it from actually copying, I've wrapped up the right options to use in a variable. Care is taken to specify all the options, even those which are set to the right value by default, just in case the makefile is being used on a system which has things configured in an odd manner.

CP = copy
CPOPT = A~CF~NQR~S~T~V

In this case some of the options are redundant, since this rule completely wipes the destination directory before copying over the new files. But for bigger projects it might make sense to build the directory in a piecemeal fashion, where the extra options are needed.

Once the directory is built, the binary.zip rule can produce the resulting zip file:

# Binary RiscPkg archive
binary.zip: build/pkg-dir
	remove binary/zip
	dir build.pkg
	zip -rqI9 ^.^.binary/zip *
	dir ^.^

Note that in this case I could have merged the binary.zip and build/pkg-dir rules together, since build/pkg-dir is only used once. And arguably they should be merged, just in case I decide to test the app by running the version in the build.pkg folder, and it writes out a log file or something which then accidentally gets included in the zip when I invoke the binary.zip rule later on.

But, on the other hand, keeping the two rules separate means that it's easy to add a special test rule that copies the contents of build.pkg somewhere else for safe testing of the app. And as mentioned above, for big apps/packages it may also make sense to break down build/pkg-dir into several rules, since wiping the entire directory each time may be a bit inefficient.

In closing

With a setup like the above, it's easy to automate building of packages for applications. Next time, I'll be looking at how to automate publishing of packages - generating package index files, generating the pointer file required for having your packages included in ROOL's index, and techniques for actually uploading the necessary files to your website.

No comments in forum

Categories: RISC OS

Multi Cloud-Are we all talking about the same Multi Cloud?

Theresa Miller - Thu, 09/13/2018 - 05:30

The latest buzzword of the day is multi cloud and its usage within the enterprise. There's lots of confusion and speculation, but what does multi cloud really mean? Are we all talking about the same thing when we say multi cloud? Because there are different types of cloud service offerings, the meaning of multi cloud can […]

The post Multi Cloud-Are we all talking about the same Multi Cloud? appeared first on 24x7ITConnection.

Orpheus hits crowdfunding target

The Iconbar - Tue, 09/11/2018 - 16:26
In July, Orpheus announced their plan to crowdfund their new project.

With their usual modesty, they recently quietly updated their website to say the company had raised the target figure and work has begun. Excellent news for the RISC OS market and for their customers...

On a personal note, my 6-year-old router had issues over the weekend. Richard Brown from Orpheus was on the phone sorting it out at 9am on Saturday morning and helping me to sort out a replacement router asap...

Orpheus Internet website

No comments in forum

Categories: RISC OS

Your VMworld US 2018 Recap, Announcements and Sessions

Theresa Miller - Tue, 09/11/2018 - 05:30

VMware took the stage once again in Las Vegas in August 2018 as another VMworld came and went, loaded with announcements and content.  Lots of updates were shared for existing products as well as new products, and even a brand new acquisition.  Not only was there lots of technical content and an update […]

The post Your VMworld US 2018 Recap, Announcements and Sessions appeared first on 24x7ITConnection.

RISC OS interview with Jeroen Vermeulen

The Iconbar - Fri, 09/07/2018 - 05:53
This time, it is our pleasure to interview Jeroen Vermeulen, who has just released a RISC OS remake of the old BBC Micro game Dickie Brickie, which is now free on !Store.

Would you like to introduce yourself?
My name is Jeroen Vermeulen and I’m from The Netherlands. Recently I’ve remade the BBC Micro game Dickie Brickie for RISC OS which is available from the PlingStore.

How long have you been using RISC OS?
I’ve used RISC OS way back in the past and only quite recently came back to it. My experience with RISC OS started when I bought an Acorn A3000 in mid 1990. It was followed up with an A4000, which I used until around 1998. I then left the RISC OS scene. Shortly after the Raspberry Pi was introduced and RISC OS became available for it, I started to play around with it again. Nothing too serious until mid last year, when I decided to pick up programming again and do programming on RISC OS as well. Before I owned an A3000, my brother and I owned a BBC Micro from around 1985.

What other systems do you use?
Windows 10 PC/laptop, Apple iPad.

What is your current RISC OS setup?
RPI 2B with Pi Zero Desktop and SSD. Next to that I use emulators on Windows 10 like RPCEMU, Arculator, VA5000.

What do you think of the retro scene?
I very much love the RISC OS as well as the BBC Micro retro scene. For RISC OS, for example, I find it amazing what Jon Abbott has been doing with ADFFS. For the BBC Micro, I’m finally able to collect programs I once could only read about and have a play with them. Some of the new software that appears for the BBC Micro is extraordinary, and I find it very interesting to follow the stardot.org.uk forums, with people like kieranhj, tricky, sarahwalker, robc to name but a few doing some wonderful things with the machine and making it work under emulation as well.

Do you attend any of the shows and what do you think of them?
No (not yet), but I follow the show reports via sites like Iconbar and Riscository. When available I even watch some of the shows’ videos. I like that the reports/videos are online; they give some valuable extra/background information if you’ve not been there, as well as putting faces to the names you otherwise only read about 😊

What do you use RISC OS for in 2018 and what do you like most about it?
Programming. I very much like the fact that e.g. AMCOG and Drag’nDrop programs are available and their sources are “open”, and thus can be studied to learn from. This and the AMCOG Dev Kit allow you to do things that would otherwise cost more time. It’s the reason why I decided to distribute the sources with the Dickie Brickie game as well, just in case…

Retro kinds of things, like running games and other programs. On my PC I have an application called LaunchBox which allows RISC OS and BBC Micro programs to be run with a click of a button under emulation. I’m now able to run software and games that I could once only read about in the Acorn magazines of the time. For some reason, especially with the BBC Micro, it was hard to get any software where we lived, and we had to make do with programming some of it ourselves or typing it in from magazine listings. The latter led me, many years later, to remake Dickie Brickie. Back in the day it was a lot of work to type it in, but when we ran it we finally got a glimpse of what the machine was capable of, with the sprites, sound and animations on display.

What is your favourite feature/killer program in RISC OS?
StrongED & StrongHelp, BBC Basic, Netsurf, ADFFS, ArcEm, BeebIt, InfoZip, AMCOG Dev Kit

What would you most like to see in RISC OS in the future?
Just ongoing developments in general like RISC OS Open is doing with some of the foundations of the system.

Favourite (vaguely RISC OS-related) moan?
Things can always be better of course, but sometimes I’m just amazed that RISC OS is still around and actively used and developed for. For what I want to do with RISC OS currently – mainly programming – and the fact that I’m still (re-)discovering/learning things, I don’t have any complaints.

Can you tell us about what you are working on in the RISC OS market at the moment?
I have been working on a remake of the BBC Micro game Dickie Brickie. I started remaking it using my own code, but when I learned about the AMCOG Dev Kit I switched over and rewrote most of the game. There is a really nice article on the game at the Riscository site.

Any surprises you can't talk about, or dates to tease us with?
I’m investigating a next game to program. I quite like the idea of making a platform game, but I’ve some learning to do on how to do that so it could be a while.

Apart from iconbar (obviously) what are your favourite websites?
Riscository, RISC OS Open (Forums), RISCOS Blog, DragDrop, Stardot (Forums) and some recently discovered websites on programming and game development.

What are your interests beyond RISC OS?
Programming and IT in general.

If someone hired you for a month to develop RISC OS software, what would you create?
That’s a tough question… perhaps some updates to Paint.

Any future plans or ideas you can share with us?
I would like to investigate the use of the DDE and C language.

What would you most like Father Christmas to bring you as a present?
Nothing very special comes to mind. But it would be nice if JASPP would be allowed to distribute some more games, and/or if games from the past (e.g. from 4th Dimension) were more easily available.

Any questions we forgot to ask you?
No. Thank you very much for the interview!

No comments in forum

Categories: RISC OS

Storage Sense on Windows 10 configured with Intune

Aaron Parker's stealthpuppy - Sun, 09/02/2018 - 10:46

In a modern management scenario, enabling end-points to perform automatic maintenance tasks will reduce TCO by avoiding scenarios that might result in support calls. Storage Sense in Windows 10 is a great way to manage free disk space on PCs by clearing caches, temporary files, old downloads, Windows Update cleanup files, previous Windows versions, and more, but it’s not fully enabled by default. Storage Sense can potentially remove gigabytes of data, freeing up valuable space on smaller drives.

Here’s how to enable this feature on Windows 10 PCs enrolled in Microsoft Intune.

Storage Sense Settings

Storage Sense can be found in the Windows 10 Settings app and has only a few settings that can be changed. Typically a user might enable Storage Sense and accept the default settings, and for most PCs the defaults are likely good enough. Here’s what’s available in Windows 10 1803:

Enabling Storage Sense in Windows 10 Settings

Settings are stored in the user profile at:

HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\StorageSense\Parameters\StoragePolicy

Settings are stored somewhat cryptically, with numbers representing various options.

Storage Sense settings in the Registry

These values translate to the following options and values:

  • Storage Sense (value 01): Off = 0, On = 1
  • Run Storage Sense (value 2048): Every Day = 1, Every Week = 7, Every Month = 30, When Windows decides = 0
  • Delete temporary files that my apps aren't using (value 04): Selected = 1, Not selected = 0
  • Delete files in my recycle bin if they have been there for over (values 08 and 256): 08 toggles the setting (Off = 0, On = 1), and 256 holds the age (Never = 0, 1 day = 1, 14 days = 14, 30 days = 30, 60 days = 60)
  • Delete files in my Downloads folder if they have been there for over (values 32 and 512): 32 toggles the setting (Off = 0, On = 1), and 512 holds the age (Never = 0, 1 day = 1, 14 days = 14, 30 days = 30, 60 days = 60)

Now that we know what the options are, we can decide on what to deploy and deliver them to enrolled end-points.

Configure via PowerShell

Using the values from the table above, a PowerShell script can be deployed via Intune to configure our desired settings. The script below will enable Storage Sense along with several settings to regularly remove outdated or temporary files.

# Enable Storage Sense
# Ensure the StorageSense key exists
$key = "HKCU:\SOFTWARE\Microsoft\Windows\CurrentVersion\StorageSense"
If (!(Test-Path "$key")) { New-Item -Path "$key" | Out-Null }
If (!(Test-Path "$key\Parameters")) { New-Item -Path "$key\Parameters" | Out-Null }
If (!(Test-Path "$key\Parameters\StoragePolicy")) { New-Item -Path "$key\Parameters\StoragePolicy" | Out-Null }

# Set Storage Sense settings
# Enable Storage Sense
Set-ItemProperty -Path "$key\Parameters\StoragePolicy" -Name "01" -Type DWord -Value 1

# Set 'Run Storage Sense' to Every Week
Set-ItemProperty -Path "$key\Parameters\StoragePolicy" -Name "2048" -Type DWord -Value 7

# Enable 'Delete temporary files that my apps aren't using'
Set-ItemProperty -Path "$key\Parameters\StoragePolicy" -Name "04" -Type DWord -Value 1

# Set 'Delete files in my recycle bin if they have been there for over' to 14 days
Set-ItemProperty -Path "$key\Parameters\StoragePolicy" -Name "08" -Type DWord -Value 1
Set-ItemProperty -Path "$key\Parameters\StoragePolicy" -Name "256" -Type DWord -Value 14

# Set 'Delete files in my Downloads folder if they have been there for over' to 60 days
Set-ItemProperty -Path "$key\Parameters\StoragePolicy" -Name "32" -Type DWord -Value 1
Set-ItemProperty -Path "$key\Parameters\StoragePolicy" -Name "512" -Type DWord -Value 60

# Set value that Storage Sense has already notified the user
Set-ItemProperty -Path "$key\Parameters\StoragePolicy" -Name "StoragePoliciesNotified" -Type DWord -Value 1

Modify the script as desired – at the very least the script should enable Storage Sense and leave the remaining settings at their defaults. Save the script as a PowerShell file and deploy it via the Intune console in the Azure portal. Ensure that the script runs with the logged-on user’s credentials, because it writes to HKCU.

Enabling Storage Sense with a PowerShell script in Intune

Assign the script to All Users and their PCs will receive the script. It’s important to note that, because the settings are stored in HKCU and are not policies, the user can either disable Storage Sense or change other settings.
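
To spot-check the result on a test machine, or to see whether a user has since changed anything, the values can be read back with a couple of lines of PowerShell. This is a quick sketch, run in the user's context:

# Read back the Storage Sense policy values written by the script above
$key = "HKCU:\SOFTWARE\Microsoft\Windows\CurrentVersion\StorageSense\Parameters\StoragePolicy"
Get-ItemProperty -Path $key | Select-Object "01", "2048", "04", "08", "256", "32", "512"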

Wrapping Up

Storage Sense is a great feature to enable on Windows 10, for both personal and corporate PCs. In a modern management scenario, it’s another tool in our kit for enabling end-points to be self-sufficient, so I highly recommend testing and enabling the feature by default. This article has shown you how to configure Storage Sense via Intune and PowerShell, with all of the settings required to tailor it to your requirements.

Hold On…

Storage Sense shows you in the Settings app how much disk capacity has been cleaned in the previous month. For a bit of a laugh, you can modify the value where this figure is stored so that Settings displays space savings that are clearly not genuine.

Messing around with the value of saved space

You’ll find the registry value (20180901) in this key:

HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\StorageSense\Parameters\StoragePolicy\SpaceHistory
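
For example, something along these lines will do it. Treat this as a sketch only: I'm assuming the value holds a simple byte count, since the data type and units aren't documented:

# Hypothetical prank: inflate the 'space saved' figure shown in Settings.
# The value name appears to encode a date; use the name present on your machine.
$key = "HKCU:\Software\Microsoft\Windows\CurrentVersion\StorageSense\Parameters\StoragePolicy\SpaceHistory"
Set-ItemProperty -Path $key -Name "20180901" -Value 2147483647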

Image Credit: Photo by Florian Pérennès on Unsplash

This article by Aaron Parker, Storage Sense on Windows 10 configured with Intune appeared first on Aaron Parker.

Categories: Community, Virtualisation

Acorn World at Cambridge computer museum, 8-9th Sept 2018

The Iconbar - Sat, 09/01/2018 - 01:55
Acorn World 2018
Sat 8th & Sun 9th September, 10am-5pm
@ The Centre for Computing History, Cambridge
http://www.computinghistory.org.uk/det/43277/Acorn-World-Exhibition-8th-9th-September-2018/

The Acorn & BBC User Group in association with the Centre for Computing History, Cambridge’s premier computer museum, are pleased to announce Acorn World 2018.

This exhibition will feature machines and software from across Acorn’s history and beyond, showing how they started, the innovative systems produced along the way, and the legacy of successful technology they left behind.

There will be a range of Acorn-era computers on display – and in many cases running for visitors to try out for themselves – covering everything from the System 1, through to the iconic RiscPC – which many recognise as the pinnacle of Acorn’s computer designs – and beyond, including the never-released Phoebe, and a number of rare prototypes. The vintage displays will also include classic magazines, sure to set those nostalgic flames burning, and software which enthralled, entertained, and educated many users – and even inspired some to go into programming themselves.
Some of those classic computers have been given a new lease of life by enthusiastic users, with modern add-ons and other clever innovations – and there will be a number of these on display as well.

The exhibition doesn’t just stop at machines that came directly from the Acorn stable, though – there will also be post-Acorn systems, including the ultra-cheap Raspberry Pi and at the other end of the scale, the ‘slightly pricier’ Titanium – both of which are themselves children of Cambridge.

Tickets are only £8 for adults, £7 for over-60s, and £6 for children. This includes access to all the museum’s exhibits featuring mainframe, mini, and home computers and games consoles from the past 50 years, plus the Guinness World Record-holding MegaProcessor. This is a fund-raising event for the museum to help continue their important work preserving and archiving computing history.

The Centre for Computing History, Rene Court, Coldhams Rd, Cambridge, CB1 3EW
http://www.computinghistory.org.uk/

No comments in forum

Categories: RISC OS

August News round-up

The Iconbar - Fri, 08/31/2018 - 07:11
Some things we noticed this month. What did you see?

DDE28c update from ROOL.

Prophet Version 3.94 and Font Directory Pro 3.23 now available from Elesar

Orpheus Internet launches a crowdfunding campaign to finance the upgrading of their services. Latest total

It is games month on RISC OS blog!

New 32bit version of Dickie Brickie now on !Store for free.

R-Comp SP12a brings DualMonitor version 5 and lots of RISC OS 5.24 software updates to TiMachine.

The ROOL TCP/IP "phase 1" bounty reaches a major milestone with a beta release of the updated AcornSSL module, supporting the modern TLS protocol instead of the old and insecure SSL protocol.

André Timmermans releases DigitalCD 3.11 and KinoAmp 0.48. The new version of KinoAmp is able to use hardware overlays for improved playback performance on machines and OS versions which support that functionality.

ADFFS 2.68 released. ROOL have also updated their website

IconBar will be running regular articles over the Autumn after a bit of a summer break. We kick off next Friday with an interview...

No comments in forum

Categories: RISC OS

New - NetScaler Gateway (Feature Phase) 12.1 Build 49.23

Netscaler Gateway downloads - Thu, 08/30/2018 - 21:00
New downloads are available for Citrix Gateway
Categories: Citrix, Commercial, Downloads

New - Components for NetScaler Gateway 12.1

Netscaler Gateway downloads - Thu, 08/30/2018 - 18:30
New downloads are available for Citrix Gateway
Categories: Citrix, Commercial, Downloads

New - NetScaler Gateway (Feature Phase) Plug-ins and Clients for Build 12.1-49.23

Netscaler Gateway downloads - Thu, 08/30/2018 - 18:30
New downloads are available for Citrix Gateway
Categories: Citrix, Commercial, Downloads

Everything You Need To Know About VMworld 2018

Theresa Miller - Tue, 08/28/2018 - 05:30

Once again, it is time for VMworld! VMworld 2018 is taking place in Las Vegas at the Mandalay Bay hotel RIGHT NOW! Here is everything you need to know to have a great time. One of the things everyone is most excited about when it comes to VMworld is always the new announcements from VMware. […]

The post Everything You Need To Know About VMworld 2018 appeared first on 24x7ITConnection.

New - Components for NetScaler Gateway 12.0

Netscaler Gateway downloads - Mon, 08/27/2018 - 07:00
New downloads are available for Citrix Gateway
Categories: Citrix, Commercial, Downloads

New - NetScaler Gateway (Feature Phase) 12.0 Build 58.18

Netscaler Gateway downloads - Mon, 08/27/2018 - 07:00
New downloads are available for Citrix Gateway
Categories: Citrix, Commercial, Downloads

New - NetScaler Gateway (Feature Phase) Plug-ins and Clients for Build 12.0-58.18

Netscaler Gateway downloads - Mon, 08/27/2018 - 07:00
New downloads are available for Citrix Gateway
Categories: Citrix, Commercial, Downloads

Workstations in the Cloud? Let’s take a look at Citrix Cloud and VMWare Horizon Cloud

Theresa Miller - Tue, 08/21/2018 - 05:30

VDI workstation virtualization is a common approach for corporate desktop and application access, and it allows IT to centralize information in the corporate datacenter.  These approaches allow for some key business benefits, with the biggest being mobility. Users can work from anywhere and at any time, allowing for a great amount of flexibility for work teams. […]

The post Workstations in the Cloud? Let’s take a look at Citrix Cloud and VMWare Horizon Cloud appeared first on 24x7ITConnection.


Subscribe to Spellings.net aggregator