Go Features In Structs & Generics
What Is Beauty?
Of the beautiful things I have found in coding, first-class functions are numero uno for me. After that it has to be map, filter and reduce in regular, plain-old JavaScript. In fact, even though Python was my first language, I have trouble going back to it now because of how much nicer the ergonomics of first-class functions are in both Go and JS.
Also, lambda is not a replacement in Python, and yes, it sucks comparatively. Please don't.
I still use Python, don't get me wrong, but more sparingly now than before. Go is what I typically want to use and JS is what I end up using after talking with the team. Oh well.
Map, Filter, Reduce
In JavaScript, you can natively use these methods on Array objects. It feels so intuitive and good. I found myself regularly using anonymous functions in Go to attain the same type of effect.
const main = () => {
  const arr = [1, 2, 3, 4, 5];
  someOtherFuncThatNeedsFilteredArray(arr.filter((v) => v > 0));
};
In JS, that feels good. In Go, it looks something like this:
func main() {
    arr := []int{1, 2, 3, 4, 5}
    someOtherFuncThatNeedsFilteredSlice(func() []int {
        r := make([]int, 0)
        for i := range arr {
            if arr[i] > 0 {
                r = append(r, arr[i])
            }
        }
        return r
    }())
}
And I still like it. It can be ugly, I know, but being able to throw in anonymous functions when needed to solve a one-off issue feels really good. Sexy even.
But using generics and structs, you can achieve a similar-feeling effect to JavaScript:
func main() {
    chainArr := NewChainArray([]int{1, 2, 3, 4, 5})
    someOtherFuncThatNeedsFilteredSlice(chainArr.Filter(func(v, i int) bool { return v > 0 }).Slice())
}
Okay, so no arrow notation or anything like that, but still, not bad. To achieve this, just make a simple wrapper around a generic slice, then add whatever functionality you want.
type ChainArray[T any] struct {
    arr []T
}

func NewChainArray[T any](data []T) *ChainArray[T] {
    c := &ChainArray[T]{arr: make([]T, len(data))}
    copy(c.arr, data)
    return c
}

func (c *ChainArray[T]) Slice() []T {
    return c.arr
}
// Filter keeps only the elements for which f returns true.
func (c *ChainArray[T]) Filter(f func(T, int) bool) *ChainArray[T] {
    result := make([]T, 0)
    for i, v := range c.arr {
        if f(v, i) {
            result = append(result, v)
        }
    }
    c.arr = result
    return c
}
Now, we should still appreciate just using anonymous functions for this. Making a wrapper purely for syntactic sugar seems pointless and adds overhead, so I don't recommend it. But it's still cool that you can. I'll add the Map and Reduce functions too but won't go too in depth with them.
// Map replaces each element with the result of f.
func (c *ChainArray[T]) Map(f func(T, int) T) *ChainArray[T] {
    result := make([]T, len(c.arr))
    for i, v := range c.arr {
        result[i] = f(v, i)
    }
    c.arr = result
    return c
}
// Reduce folds the slice into a single value, seeding the accumulator with
// the first element (like JS reduce without an initial value).
func (c *ChainArray[T]) Reduce(f func(T, T, int) T) T {
    var result T
    if len(c.arr) == 0 {
        return result
    }
    result = c.arr[0]
    for i := 1; i < len(c.arr); i++ {
        result = f(result, c.arr[i], i)
    }
    return result
}
With the above you could also chain Filter(f()).Map(f()) and it works just fine. Obviously you can also add Reduce(f()) at the end, but you cannot chain off of that for obvious reasons: it returns a single value, not the wrapper.
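Just to show it end to end, here is a minimal sketch of a full chain using the methods above (made-up numbers; it keeps the positive values, squares them, then sums them):
func main() {
    // Keep the positive values, square them, then sum everything up.
    sum := NewChainArray([]int{-2, -1, 0, 1, 2, 3}).
        Filter(func(v, i int) bool { return v > 0 }).
        Map(func(v, i int) int { return v * v }).
        Reduce(func(prev, curr, i int) int { return prev + curr })
    _ = sum // 1 + 4 + 9 = 14
}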
Also, I would recommend just using standalone functions, since you don't need to chain super often. This is just a fun thing to play with.
Custom Iterators For Ranging Over Structs
With iterators, you can write a struct method that hooks into the way iterators are implemented in Go. The iter package is really cool, but I am going to just match the function signature that range requires. You can use the Seq types from that package (Seq2 here, since we yield two values) or just return a func that matches the signature. Either way you gotta write that function baby.
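If you haven't played with range-over-func yet, here is about the smallest sketch I can think of (countdown is a made-up function, not something from the standard library): return a func that takes a yield callback, call yield once per value, and stop as soon as yield returns false.
// countdown yields n, n-1, ..., 1. Ranging over a func needs Go 1.23+.
func countdown(n int) func(func(int) bool) {
    return func(yield func(int) bool) {
        for i := n; i > 0; i-- {
            if !yield(i) {
                return // the consumer broke out of the range loop early
            }
        }
    }
}

func main() {
    for v := range countdown(3) {
        fmt.Println(v) // prints 3, 2, 1
    }
}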
Say you have a struct with some sort of internal slices and/or maps to manage something. Building a custom wrapper around exactly how you want to go through data while ranging over it is sooo fun. I'm not super creative, so the example I have is a struct that contains CSV data and iterates through each column on each row all at once... for some reason. And puts it into JSON. FYI, you would never actually do something like this, there are much better ways; playing around is not always about optimisation.
package main

import (
    "encoding/json"
    "fmt"
    "strings"
)

type CSVStruct struct {
    Fields []string
    Values [][]string
}

type Iterator[I int, T any] interface {
    Iter() func(func(I, T) bool)
}

func (c CSVStruct) Iter() func(func(int, string) bool) {
    return func(yield func(int, string) bool) {
        for i := 0; i < len(c.Values); i++ {
            for j := 0; j < len(c.Values[i]); j++ {
                // Yield a "Field|Value" pair for each column of row i.
                if !yield(i, fmt.Sprintf("%s|%s", c.Fields[j], c.Values[i][j])) {
                    return
                }
            }
        }
    }
}
func parseIter(iter Iterator[int, string]) ([]byte, error) {
    row := 0
    jsonV := make(map[string]string)
    total := make([]map[string]string, 0)
    for i, v := range iter.Iter() {
        if row != i {
            // We just moved on to a new row, so flush the previous one.
            row++
            total = append(total, jsonV)
            jsonV = make(map[string]string)
        }
        x := strings.Split(v, "|")
        jsonV[x[0]] = x[1]
    }
    // The last row never triggers the flush above, so append it here.
    if len(jsonV) > 0 {
        total = append(total, jsonV)
    }
    return json.Marshal(&total)
}
func main() {
    fields := []string{"Year", "Make", "Model"}
    values := [][]string{
        {"2011", "Mazda", "Miata"},
        {"2012", "Hyundai", "Elantra"},
        {"2013", "Nissan", "Altima"},
        {"2014", "Kia", "Soul"},
        {"2015", "Honda", "Accord"},
        {"2016", "Toyota", "Highlander"},
        {"2017", "Volkswagen", "Tiguan"},
    }
    csv := CSVStruct{
        Fields: fields,
        Values: values,
    }
    x, err := parseIter(csv)
    if err != nil {
        panic(err)
    }
    fmt.Println(string(x))
}
The result after running go run . | jq .:
[
  {
    "Make": "Mazda",
    "Model": "Miata",
    "Year": "2011"
  },
  {
    "Make": "Hyundai",
    "Model": "Elantra",
    "Year": "2012"
  },
  {
    "Make": "Nissan",
    "Model": "Altima",
    "Year": "2013"
  },
  {
    "Make": "Kia",
    "Model": "Soul",
    "Year": "2014"
  },
  {
    "Make": "Honda",
    "Model": "Accord",
    "Year": "2015"
  },
  {
    "Make": "Toyota",
    "Model": "Highlander",
    "Year": "2016"
  },
  {
    "Make": "Volkswagen",
    "Model": "Tiguan",
    "Year": "2017"
  }
]
Again, there is no real purpose to how I made this other than I was just playing with the iterators for fun. For actual use, just reach for the iter package rather than doing the above.
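For what it's worth, here is a rough sketch of what that could look like for the Iter method above, just swapping the hand-written return type for iter.Seq2 (the body stays the same, and the Iterator interface would need the matching change):
// Same iterator as before, but typed with iter.Seq2.
// Requires Go 1.23+ and "iter" added to the import block.
func (c CSVStruct) Iter() iter.Seq2[int, string] {
    return func(yield func(int, string) bool) {
        for i, row := range c.Values {
            for j, val := range row {
                if !yield(i, fmt.Sprintf("%s|%s", c.Fields[j], val)) {
                    return
                }
            }
        }
    }
}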
Not sure why I'm making this post; it's late and I'm tired. But have fun messing around with Go and JavaScript, and never lose your spark for exploring a language.
-jdev